Forecasting • Causal Inference • Python / R / SQL / Stata
I work in applied economics with a focus on forecasting, causal inference, and data-intensive analysis. I'm currently completing an M.S. in Applied Economics at the University of Maryland, and I hold an MPA from the University of Montana.
Before moving into economics, I spent more than a decade running operations in hospitality. That background shaped how I think about uncertainty and decision-making, but my work today is grounded in the idea that economics is an abstraction: models don't need to mirror reality to be useful—they need to clarify it. What matters is understanding the assumptions, the limits, and the degree of confidence a model actually earns.
I build forecasting models and causal inference frameworks using Python, R, SQL, and Stata. My work includes Monte Carlo simulation, time-series decomposition, ensemble forecasting, and econometric designs such as difference-in-differences, event studies, and synthetic control.
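To give a flavor of the Monte Carlo side of that toolkit, here is a minimal sketch of simulation-based forecast intervals. Everything in it is illustrative: the AR(1)-style process, the parameter values, and the horizon are stand-ins, not estimates from any of my projects.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative mean-reverting process with assumed parameters
# (phi, sigma, last_obs are hypothetical, not fitted values).
phi, sigma, last_obs = 0.8, 2.0, 100.0
n_sims, horizon = 10_000, 12

# Simulate many future paths rather than a single point forecast.
paths = np.empty((n_sims, horizon))
level = np.full(n_sims, last_obs)
for t in range(horizon):
    level = phi * level + (1 - phi) * last_obs + rng.normal(0, sigma, n_sims)
    paths[:, t] = level

# Calibrated uncertainty: report quantiles of the simulated
# distribution at the final horizon, not just the central path.
lo, med, hi = np.percentile(paths[:, -1], [5, 50, 95])
```

The point of the simulation is the interval, not the point estimate: the 5th and 95th percentiles of the simulated paths are the forecast's honest error bars.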
My current flagship project examines how data center investment has driven regional electricity demand growth across PJM, ERCOT, and MISO. The design uses a four-method identification strategy — panel regression with wild cluster bootstrap inference, synthetic control, difference-in-differences with low-exposure controls, and narrative validation — to bound the demand effect with a defensible range of estimates, while explicitly documenting why clean causal identification is not currently possible with public data alone. The analysis panel covers 84 months across three balancing authorities and feeds a forward-looking grid stress forecasting model.
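The wild cluster bootstrap mentioned above can be sketched in a few lines. This is a simplified, unrestricted variant on synthetic data — the clusters, sample sizes, and coefficients are all hypothetical, and the real analysis imposes more structure than this — but it shows the core move: resample by flipping residuals with a single Rademacher draw per cluster, so within-cluster dependence survives the resampling.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical panel: y = a + b*x with cluster-level shocks.
# None of these numbers come from the PJM/ERCOT/MISO data.
G, n_per = 30, 24
cluster = np.repeat(np.arange(G), n_per)
x = rng.normal(size=G * n_per) + rng.normal(size=G)[cluster]
y = 1.0 + 0.5 * x + rng.normal(size=G)[cluster] + rng.normal(size=G * n_per)

# OLS fit and residuals.
X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
fitted = X @ beta
resid = y - fitted

# Wild cluster bootstrap: one Rademacher flip per cluster, applied to
# every residual in that cluster, then refit and store the slope.
B = 999
boot = np.empty(B)
for b in range(B):
    flips = rng.choice([-1.0, 1.0], size=G)[cluster]
    y_star = fitted + flips * resid
    boot[b] = np.linalg.lstsq(X, y_star, rcond=None)[0][1]

# Percentile interval for the slope under cluster dependence.
ci_lo, ci_hi = np.percentile(boot, [2.5, 97.5])
```

Because the flip is drawn at the cluster level, serially correlated errors within a balancing authority stay intact inside each bootstrap sample — which is exactly why ordinary i.i.d. resampling would understate the uncertainty here.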
Each project is built end-to-end: data construction, model design, diagnostics, and interpretation. The code is part of the deliverable, and every analysis is documented so the reasoning is visible and reproducible.
I rely on transparent assumptions, calibrated uncertainty, and methods that can be scrutinized and replicated. I treat modeling as an iterative process—build, test, stress, revise—until the results are stable and the limitations are clear.
I also maintain a track record of probabilistic forecasting through platforms like Good Judgment Open and Metaculus. Regular calibration against resolved outcomes keeps my intuition aligned with actual accuracy, not perceived confidence.
This site is a working portfolio of my analytical approach. It shows how I structure problems, how I build models, and how I handle uncertainty. The forecasts update, the methods are documented, and the code is public. The goal is clarity—so the work can be evaluated on its merits.
You can reach me through the contact form or on LinkedIn.