Projects
Talcott Resolution 2023 – Present
Financial Data Validation & Reconciliation Platform
Downstream reporting depended on data from multiple systems that frequently diverged due to timing differences, schema mismatches, and vendor inconsistencies.
I built Python validation pipelines that compare datasets across Oracle, SQL Server, and vendor feeds, normalize schemas, and surface discrepancies before they reach reporting workflows. The pipelines run unattended and produce discrepancy reports that let teams identify and resolve data issues without manual investigation.
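The core reconciliation idea can be sketched in a few lines. This is a minimal, self-contained illustration, not the production pipeline; the function name, field names, and report structure are all hypothetical:

```python
from typing import Any

def reconcile(source_a: dict[str, dict[str, Any]],
              source_b: dict[str, dict[str, Any]],
              fields: list[str]) -> dict[str, list]:
    """Compare two datasets keyed by record ID and report every discrepancy."""
    report: dict[str, list] = {"missing_in_b": [], "missing_in_a": [], "mismatched": []}
    # Records present in one source but not the other.
    report["missing_in_b"] = sorted(source_a.keys() - source_b.keys())
    report["missing_in_a"] = sorted(source_b.keys() - source_a.keys())
    # Records present in both: compare the normalized fields value by value.
    for key in sorted(source_a.keys() & source_b.keys()):
        diffs = {f: (source_a[key].get(f), source_b[key].get(f))
                 for f in fields
                 if source_a[key].get(f) != source_b[key].get(f)}
        if diffs:
            report["mismatched"].append((key, diffs))
    return report
```

Keying both sides by a shared record ID before comparing is what lets the same logic run against Oracle extracts, SQL Server extracts, or vendor files once their schemas are normalized.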
Azure On-Demand Reporting Platform
Business users needed a way to generate operational reports on demand without relying on long-running batch jobs or permanently allocated compute.
I built a FastAPI service deployed in Azure Container Apps, exposing HTTP endpoints through ingress to trigger report-generation workflows. These endpoints launch Azure Container App Jobs that spin up containerized Python processes to generate downloadable XLSX reports for the business. This architecture cleanly separates lightweight API orchestration from heavier compute workloads, enabling scalable, on-demand report generation across environments.
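The key design point is that the API layer only validates requests and builds a launch spec; the container job does the heavy work. A stripped-down sketch of that boundary, with hypothetical report names and a hypothetical `JobSpec` type (the real service wires this to FastAPI routes and Azure Container App Jobs):

```python
import uuid
from dataclasses import dataclass, field

# Hypothetical catalog of reports the API is willing to launch.
VALID_REPORTS = {"daily_positions", "settlement_summary"}

@dataclass
class JobSpec:
    """Everything the container job needs; the API never runs the report itself."""
    job_id: str
    report: str
    params: dict = field(default_factory=dict)

def build_job_spec(report: str, params: dict) -> JobSpec:
    """Validate an incoming HTTP request and translate it into a job launch spec."""
    if report not in VALID_REPORTS:
        raise ValueError(f"unknown report: {report}")
    return JobSpec(job_id=str(uuid.uuid4()), report=report, params=params)
```

Because the endpoint returns as soon as the job is launched, the API container can stay small and always-on while report compute scales to zero between requests.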
Metadata-Driven Ingestion Framework
Adding new data sources previously required writing custom ingestion logic for each format, slowing onboarding and increasing maintenance overhead.
I designed a metadata-driven framework where each data source is defined by configuration rather than code. The system handles XML, JSON, and Excel inputs using shared processing logic, enabling rapid onboarding of new sources while maintaining consistent validation, transformation, and error handling across pipelines.
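A toy version of the configuration-over-code idea, assuming a hypothetical source registry and two of the supported formats (the real framework also covers Excel and adds validation and error handling):

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical registry: onboarding a new source means adding an entry here,
# not writing new ingestion code.
SOURCES = {
    "vendor_positions": {"format": "json"},
    "custodian_feed":   {"format": "xml", "record_tag": "position"},
}

def parse_records(source_name: str, raw: str) -> list[dict]:
    """Route raw input through shared parsing logic selected by configuration."""
    cfg = SOURCES[source_name]
    if cfg["format"] == "json":
        return json.loads(raw)
    if cfg["format"] == "xml":
        root = ET.fromstring(raw)
        return [dict(el.attrib) for el in root.iter(cfg["record_tag"])]
    raise ValueError(f"unsupported format: {cfg['format']}")
```

Every source flows through the same function, so validation and error handling only have to be written once.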
Trade & Settlement Logging System
Trade and settlement activity needed to be captured, validated, and persisted in a way that downstream processes could depend on: no gaps, no duplicates, and clear auditability.
I built the backend API routes and database logic to log trade activity, applying validation and upstream enrichment at ingestion time. Added a React interface for the operations team to query and review settlement state without touching the database directly.
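The "no gaps, no duplicates" guarantees come from validating at ingestion time, before anything is persisted. An in-memory sketch of that gate, with hypothetical field names (the real system persists to a database behind API routes):

```python
class TradeLog:
    """Minimal sketch of ingestion-time validation: require fields, reject duplicates."""

    REQUIRED = {"trade_id", "symbol", "quantity"}

    def __init__(self) -> None:
        self._rows: dict[str, dict] = {}

    def log(self, trade: dict) -> None:
        missing = self.REQUIRED - trade.keys()
        if missing:
            # Incomplete records never reach storage, so downstream
            # consumers never see partial rows.
            raise ValueError(f"missing fields: {sorted(missing)}")
        if trade["trade_id"] in self._rows:
            # Idempotency guard: replayed or double-submitted trades are rejected.
            raise ValueError(f"duplicate trade: {trade['trade_id']}")
        self._rows[trade["trade_id"]] = trade
```

Rejecting bad input at the boundary keeps the persisted log authoritative, which is what lets the React interface read settlement state without anyone touching the database directly.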
Optum 2020 – 2023
Enterprise Data Pipeline Development
Healthcare data arrived from multiple upstream sources in inconsistent formats, with no standardized validation layer before it reached analytics consumers.
I built and maintained ETL pipelines that ingested, transformed, and routed large healthcare datasets across internal systems. Focused on correctness at the transformation layer — enforcing business rules, surfacing upstream data quality issues early, and ensuring downstream analytics workflows received clean, reliable inputs.
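One way to picture the transformation-layer correctness checks is a pipeline stage that enforces named business rules and quarantines failures instead of passing them downstream. A hypothetical sketch, not the production code:

```python
def validate_stage(rows, rules, quarantine):
    """Yield rows that pass every business rule; divert failures with the reason."""
    for row in rows:
        failed = [name for name, rule in rules.items() if not rule(row)]
        if failed:
            # Quarantined rows surface upstream data quality issues early,
            # with the rule names attached for triage.
            quarantine.append({"row": row, "failed_rules": failed})
        else:
            yield row
```

Because it is a generator, the stage composes with other transformations and streams large datasets without loading them into memory at once.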
Kafka Application Logging to Azure Event Hubs & ELK Stack
Application logs were scattered across services with no centralized way to monitor behavior, trace errors, or analyze patterns across systems.
I configured a Kafka-based logging pipeline that captured application events and streamed them to Azure Event Hubs. From there, logs were ingested into an ELK stack — Elasticsearch for indexing and storage, Logstash for transformation and enrichment, and Kibana for visualization. Built dashboards in Kibana to surface error rates, latency trends, and service health across environments.
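The Kafka-to-Event-Hubs hop works because Event Hubs exposes a Kafka-compatible endpoint on port 9093. The producer settings below follow kafka-python-style parameter names and the documented SASL/PLAIN pattern for that endpoint; the namespace and connection string are placeholders, not real values:

```python
# Kafka producer settings for Azure Event Hubs' Kafka-compatible endpoint.
# "<namespace>" and the connection string are placeholders.
producer_config = {
    "bootstrap_servers": "<namespace>.servicebus.windows.net:9093",
    "security_protocol": "SASL_SSL",
    "sasl_mechanism": "PLAIN",
    # Event Hubs authenticates Kafka clients with the literal username
    # "$ConnectionString" and the connection string as the password.
    "sasl_plain_username": "$ConnectionString",
    "sasl_plain_password": "Endpoint=sb://<namespace>.servicebus.windows.net/;<key>",
}
```

Applications keep producing plain Kafka events; only the connection configuration changes, which is what made Event Hubs a drop-in transport in front of the ELK stack.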
Personal Projects
Trading Signal & Execution System
A Python system that monitors intraday market data and executes rule-based trades through Alpaca's brokerage API. The system pulls a morning briefing from a Discord channel, builds a watchlist, then monitors each ticker on a configurable candle interval — evaluating EMA crossovers and breakout conditions and submitting market orders with trailing-stop exits. Pattern day trader (PDT) compliance is enforced automatically.
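The EMA-crossover signal at the heart of the strategy can be written in a few lines. A simplified sketch, assuming EMAs seeded with the first price and default 9/21 periods (the real system's periods and entry filters are configurable):

```python
def ema(prices: list[float], period: int) -> list[float]:
    """Exponential moving average, seeded with the first price."""
    k = 2 / (period + 1)
    out = [prices[0]]
    for p in prices[1:]:
        out.append(p * k + out[-1] * (1 - k))
    return out

def crossed_up(prices: list[float], fast: int = 9, slow: int = 21) -> bool:
    """True when the fast EMA moves above the slow EMA on the latest candle."""
    f, s = ema(prices, fast), ema(prices, slow)
    # Entry fires only on the transition: at/below on the prior candle,
    # above on the current one.
    return f[-2] <= s[-2] and f[-1] > s[-1]
```

Requiring the transition (rather than just `fast > slow`) is what keeps the signal from firing on every candle of an established uptrend.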
Built to understand the practical gap between signal generation and live execution: noisy data, timing constraints, and the difference between what a strategy looks like in backtesting versus what it does under real market conditions.
Market Data Analysis & Research Tools
Personal scripts for analyzing historical stock performance, screening intraday setups, and evaluating valuation metrics across watchlists. Work spans time-series pattern analysis, P/E screening, RSI-based momentum signals, and sentiment scoring from news feeds.
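As one example of the momentum work, RSI can be sketched with a simple average of recent gains and losses. This is the simple-average variant, not Wilder's smoothed recursion, and the function name is illustrative:

```python
def rsi(closes: list[float], period: int = 14) -> float:
    """Relative Strength Index over the last `period` changes (simple-average variant)."""
    deltas = [b - a for a, b in zip(closes, closes[1:])][-period:]
    gains = sum(d for d in deltas if d > 0)
    losses = sum(-d for d in deltas if d < 0)
    if losses == 0:
        return 100.0  # all gains: maximally overbought by convention
    rs = gains / losses
    return 100 - 100 / (1 + rs)
```

Screens then flag tickers crossing conventional thresholds (e.g. above 70 as overbought, below 30 as oversold).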