I love building things! I am an experienced software engineer with a background in data science and several years of experience in the geospatial field. I love looking for ways to optimize processes and use data to improve people's lives. I've always tried to gear my work toward the public good and protecting the environment. I've held positions focused on keeping habitats safe through ocean-floor mapping, expanding internet access to rural areas of the US, and using drones to deliver life-saving blood and vaccines to the hardest-to-reach places across five African countries.
- Software Development: Backend systems, algorithm optimization (Python, Rust, Go, R)
- DevOps and Cloud Computing: AWS (CDK, serverless), CI/CD (Bazel, GitHub Actions), IaC (AWS CDK)
- Data Science & Engineering: Predictive modeling, data pipelines, machine learning (Python, R)
- Geospatial Analysis: Managing and analyzing geospatial data (GDAL, GeoPandas, ArcGIS, QGIS)
- Backend Software Engineer (Geospatial) at Zipline
- Master's in Data Science, University of Virginia
- Bachelor's in Creative Technologies, Virginia Tech
- Publications:
  - “Climate and Human Mortality in Virginia,” Science of the Total Environment, Oct. 2023
  - “Hydrography from Fisheries Surveys,” The International Hydrographic Review, Nov. 2020
- More on my LinkedIn
- Software Development
  - Backend Development: Strong background in developing and optimizing backend systems and services
  - Algorithm Optimization: My role at Zipline is on the Core Optimization team, where I focus on improving algorithm performance (a toy example follows this list)
  - Languages:
    - Skilled: Python, Rust
    - Competent: R, Bash, Starlark (Bazel)
    - Knowledgeable: JavaScript, Go
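For a flavor of what that optimization work looks like, here is a minimal, self-contained sketch (a toy example, not code from any employer): the same pairwise-distance computation written as a nested Python loop and as a vectorized NumPy expression.

```python
import numpy as np

def pairwise_dists_naive(points: np.ndarray) -> np.ndarray:
    """O(n^2) Python loops: easy to read, slow for large inputs."""
    n = len(points)
    out = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            out[i, j] = np.sqrt(np.sum((points[i] - points[j]) ** 2))
    return out

def pairwise_dists_vectorized(points: np.ndarray) -> np.ndarray:
    """Same result via broadcasting: a handful of array ops, no Python loops."""
    diffs = points[:, None, :] - points[None, :, :]
    return np.sqrt((diffs ** 2).sum(axis=-1))

if __name__ == "__main__":
    pts = np.random.default_rng(0).random((500, 2))
    assert np.allclose(pairwise_dists_naive(pts), pairwise_dists_vectorized(pts))
```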
- DevOps and Cloud Computing
  - Experienced in building and deploying scalable cloud-based services and data pipelines, and in supporting CI/CD pipelines
  - Extensive experience with AWS, including the CDK, serverless architectures, Lambda, Step Functions, EKS, RDS, S3, and ECR
  - Strong background in optimizing build systems with Bazel and Docker
  - CI/CD: Skilled in setting up and managing continuous integration and deployment pipelines using tools like Bazel and GitHub Actions
  - Infrastructure as Code (IaC): Knowledgeable in PaaS and IaC, specifically the AWS CDK (see the sketch after this list)
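As an illustration of that IaC style, here is a hedged sketch of a minimal aws-cdk-lib (v2) stack in Python; the bucket, function name, handler, and asset path are placeholders rather than a real deployment.

```python
from aws_cdk import App, Stack, aws_lambda as _lambda, aws_s3 as s3
from constructs import Construct

class PipelineStack(Stack):
    """One stack: an S3 bucket for raw inputs and a Lambda that processes them."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        bucket = s3.Bucket(self, "InputBucket")  # raw data lands here

        processor = _lambda.Function(
            self,
            "Processor",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="index.handler",                 # placeholder handler module
            code=_lambda.Code.from_asset("lambda"),  # placeholder asset directory
        )

        bucket.grant_read(processor)  # least-privilege access to the bucket

app = App()
PipelineStack(app, "PipelineStack")
app.synth()
```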
- Data Science and Data Engineering
  - Predictive Modeling: Skilled in building predictive models and conducting statistical analyses (a small sketch follows this list)
  - Data Pipelines:
    - Proficient in building and optimizing cloud-based data pipelines
    - Expertise in data cleansing, wrangling, visualization, modeling, and interpretation
  - Machine Learning:
    - Experienced with deep learning, neural networks, and data preprocessing
    - A solid comprehension of AI concepts, algorithms, and frameworks
  - Tools:
    - Python: Pandas, PySpark, Dask, NumPy, scikit-learn, SciPy, XGBoost, TensorFlow, PyTorch, Keras
    - R: dplyr, caret
    - Databases: PostgreSQL, MySQL, MongoDB (NoSQL)
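A small, generic sketch of the modeling workflow I typically reach for (synthetic data here; real projects substitute a pandas DataFrame and a tuned estimator):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a real feature table and target.
X, y = make_regression(n_samples=1_000, n_features=10, noise=0.3, random_state=42)

# Keeping preprocessing and the estimator in one pipeline means
# cross-validation never leaks information from the held-out folds.
model = make_pipeline(StandardScaler(), GradientBoostingRegressor(random_state=42))
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"mean CV R^2: {scores.mean():.3f}")
```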
- Geospatial Data Analysis and Systems
  - Expert in geospatial analysis, with extensive experience in managing and analyzing geospatial data in data-heavy, cloud-based environments
  - Designed and implemented scalable architectures for raster image processing and vectorization in cloud environments
  - Implemented algorithms for image processing, feature extraction, and vectorization (a sketch of the vectorization step follows this list)
  - Tools:
    - Raster Data: GDAL/OGR, satellite imagery
    - Python: GeoPandas, Rasterio, Xarray, OpenCV, Pillow, ArcPy
    - GIS: ArcGIS, QGIS
    - Data Management: PostgreSQL + PostGIS, HDF5
    - Other: Google Earth Engine, georeferencing
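A hedged sketch of that raster-to-vector step with Rasterio, Shapely, and GeoPandas; "input.tif", the band choice, and the threshold are placeholders rather than a real dataset.

```python
import geopandas as gpd
import numpy as np
import rasterio
from rasterio import features
from shapely.geometry import shape

with rasterio.open("input.tif") as src:  # placeholder raster
    band = src.read(1)
    mask = band > 0                      # keep only pixels of interest
    records = [
        {"geometry": shape(geom), "value": int(value)}
        for geom, value in features.shapes(
            band.astype(np.int32), mask=mask, transform=src.transform
        )
    ]
    gdf = gpd.GeoDataFrame(records, geometry="geometry", crs=src.crs)

gdf.to_file("vectorized.gpkg", driver="GPKG")  # polygons ready for PostGIS or QGIS
```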