Work History

Robin

Principal Software Engineer, Data & Insights | March 2022 – Present

  • Drove the end-to-end revitalization of a long-neglected customer-facing analytics product that lacked clear ownership and investment. Assessed the existing architecture, triaged legacy components, and prioritized high-impact improvements: rebuilt a fragile Go-based export service as a reliable Node.js app with tests and monitoring; embedded Sigma to replace bespoke visualizations and enable self-serve custom dashboards; and designed performant, intuitive data models tailored to lightly technical users. Directed the work of other engineers, providing technical mentorship and architectural guidance. These efforts unlocked critical roadmap features and repositioned the product as a strategic differentiator.

  • Led the zero-downtime migration of the company’s customer-facing analytics warehouse from AWS Redshift to Snowflake, the foundation of our differentiated analytics product. A failed migration would have derailed the product roadmap and hindered our ability to compete in the marketplace. Delivered the transition seamlessly, dramatically boosting query performance and developer velocity without increasing operational costs.

  • Conceived, prototyped, and led development of Talk To Your Data (externally referred to as the AI Assistant), a natural language analytics platform enabling users to query company data conversationally using LLMs. The proof of concept won the company-wide hackathon unanimously and was greenlit for productization. As lead engineer and architect, designed the system’s multi-agent, supervisor-driven architecture, guided and coordinated the work of multiple engineers, and collaborated with Product and Design to prioritize features and manage delivery timelines.

Rhino

Data Engineering Manager | June 2021 – February 2022

  • Hired to manage and grow a team of data engineers under the Director of Data; operated as a hands-on technical lead while hiring was paused due to a company-wide layoff.

  • Redesigned the company’s data pipeline architecture around containerized dbt deployments orchestrated by Apache Airflow, cutting pipeline runtime from 45 minutes to under 5 and significantly improving reliability, observability, and developer workflow.

  • Partnered closely with the Risk team to support regulatory reporting, working to understand compliance requirements, translate them into technical checks, and ensure accurate outputs that held up under scrutiny from external stakeholders.

  • Collaborated across the Product, Engineering, Accounting, and Risk teams to align data models and reporting with business needs and regulatory constraints.

The MITRE Corporation

Senior Data Engineer | March 2020 – June 2021

  • Served as the sole data engineer on a highly cross-functional team building secure data infrastructure for government systems in AWS GovCloud. Operated with significant autonomy and trust to deliver performant, production-grade solutions in a regulated environment.

  • Designed and implemented an event-driven, serverless ETL platform using AWS Lambda, Glue, and ECS, enabling scalable ingestion and transformation of sensitive data across multiple pipelines.

  • Rewrote and optimized key ETL workflows, cutting runtime by over 50% and improving reliability and maintainability under strict compliance requirements.

  • Built Python tooling to profile and validate data from third-party REST APIs and supported ad hoc analysis using AWS Athena against S3-based datasets in a lightweight data lake architecture.

DAS42

Associate Analytics Consultant | July 2018 – March 2020

  • Designed and delivered a full-stack ELT and analytics platform for a major e-commerce client, enabling near real-time insights into the sale of over 100 million units across multiple product categories.

  • Engaged directly with stakeholders ranging from engineers to C-suite executives to translate strategic business questions into data models, dashboards, and reporting pipelines.

  • Led the development of a modern data stack using Apache Airflow, Python, Snowflake, dbt, and Looker to centralize customer data and power key operational and marketing analytics.


Skills & Technologies

Data Platforms & Tooling: Snowflake, BigQuery, ClickHouse, Redshift, PostgreSQL, dbt, Apache Airflow, Singer

Cloud & Infrastructure: AWS (Lambda, ECS, RDS, Redshift, Glue), GCP, Apache Kafka, Debezium, Terraform, Pulumi, CDK

Programming & Workflow Automation: Python, SQL, JavaScript/TypeScript, Docker, GitHub Actions

Applied LLMs & AI Infrastructure: LangChain, LangGraph (used in a production natural language analytics platform), Gemini, AWS Bedrock, OpenAI

Business Intelligence: Looker, Sigma, Tableau


Education

M.S. Business Analytics | University of Colorado Boulder
Focused on data infrastructure, machine learning, and applied analytics

B.A. Archaeology | The George Washington University