Step 1: Translate the statistical findings
The regulation's effect estimate and confidence interval need to be stated in terms policymakers can act on.
Not: "The coefficient for the post-regulation indicator is -2.3 (p = 0.04)."
Instead: "PM2.5 levels in Stockholm decreased by approximately 2.3 micrograms per cubic meter after the regulation, with a 95% confidence interval of 1.1 to 3.5 micrograms. This means we can be fairly confident the regulation is associated with a real decrease, though the size of the effect could range from modest to substantial."
Stated as a range, the confidence interval is the actionable part of the finding. Policymakers can plan around "the decrease is between 1.1 and 3.5 micrograms." They cannot plan around "p = 0.04."
Do this for each city. The effects may differ -- Stockholm's traffic density differs from Malmo's. If one city shows a stronger effect, that is informative for the agency.
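The translation can be mechanized so every city's statement comes out in the same form. A minimal sketch: the `describe_effect` helper is hypothetical, and the numbers are the Stockholm figures from the example above.

```python
def describe_effect(city: str, coef: float, ci_low: float, ci_high: float) -> str:
    """Turn a regression coefficient and its 95% CI into a plain-language
    statement about the change in PM2.5. Negative values mean a decrease.
    Assumes the interval does not span zero."""
    direction = "decreased" if coef < 0 else "increased"
    # Report magnitudes so the range reads naturally (1.1 to 3.5, not -3.5 to -1.1).
    lo, hi = sorted(abs(x) for x in (ci_low, ci_high))
    return (
        f"PM2.5 levels in {city} {direction} by approximately "
        f"{abs(coef):.1f} micrograms per cubic meter "
        f"(95% CI: {lo:.1f} to {hi:.1f})."
    )

print(describe_effect("Stockholm", -2.3, -3.5, -1.1))
# PM2.5 levels in Stockholm decreased by approximately 2.3 micrograms per cubic meter (95% CI: 1.1 to 3.5).
```

Looping this over the three cities keeps the wording consistent across the report, which matters when readers compare cities side by side.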
Step 2: Address Astrid's requirements
Walk through each of Astrid's five requirements and verify the analysis addresses it:
- Quantify the change in PM2.5 in three cities -- do you have effect estimates and confidence intervals for each city?
- Separate the regulation's effect from confounders -- does the model control for seasonal variation, weather effects, and long-term trends?
- Report uncertainty honestly -- are the confidence intervals in the report? Are the limitations stated?
- Make the analysis reproducible -- can another NERI researcher run the notebook and get the same results?
- Present findings for the regulatory agency -- is the report in a format suitable for government use?
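The confounder question in the second bullet is easiest to verify by inspecting the model's design matrix. The sketch below is not NERI's actual model; it assumes a simple OLS with a post-regulation indicator, annual seasonality terms, and a linear trend, fit to synthetic daily data so the mechanics are self-contained. Dropping the seasonal and trend columns is exactly the mistake the bullet is guarding against.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 730                           # two years of daily observations
t = np.arange(n)
post = (t >= 365).astype(float)   # regulation takes effect after year one

# Synthetic PM2.5: baseline + seasonality + slow trend + a true effect of -2.3
true_effect = -2.3
pm25 = (20.0 + 3.0 * np.sin(2 * np.pi * t / 365.25) + 0.002 * t
        + true_effect * post + rng.normal(0, 1.0, n))

# Design matrix: intercept, post indicator, seasonal terms, linear trend.
# Omitting the seasonal/trend columns would let them confound the estimate.
X = np.column_stack([
    np.ones(n),
    post,
    np.sin(2 * np.pi * t / 365.25),
    np.cos(2 * np.pi * t / 365.25),
    t,
])
beta, *_ = np.linalg.lstsq(X, pm25, rcond=None)

# Standard error and 95% CI for the post-regulation coefficient (column 1)
resid = pm25 - X @ beta
s2 = resid @ resid / (n - X.shape[1])
se = np.sqrt((s2 * np.linalg.inv(X.T @ X))[1, 1])
print(f"post effect: {beta[1]:.2f}, "
      f"95% CI: [{beta[1] - 1.96 * se:.2f}, {beta[1] + 1.96 * se:.2f}]")
```

In a real run the weather covariates (temperature, wind) would be additional columns, and the interval printed here is what Step 1 translates into policymaker language.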
Step 3: Prepare the findings report
Structure the report for the regulatory agency: executive summary, methodology description, city-by-city findings with confidence intervals, and a limitations section.
The limitations section belongs in the body of the report, not in an appendix. Policymakers who skip the appendix skip the caveats. If the analysis controlled for weather and season but not for changes in traffic patterns or industrial activity, say so in the findings section where the numbers are presented.
If you addressed Astrid's blind spot about the comparison pollutant -- showing that ozone (not targeted by the regulation) did not change over the same period -- include it. That comparison strengthens the case: PM2.5 decreased while ozone did not, suggesting the decrease is specific to the regulated pollutant rather than a general air quality trend.
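The comparison-pollutant logic reduces to a check on two intervals: the PM2.5 interval should exclude zero and the ozone interval should not. A sketch with illustrative numbers -- the ozone interval here is invented for the example, not taken from the analysis.

```python
def excludes_zero(ci_low: float, ci_high: float) -> bool:
    """True if the 95% CI excludes zero, i.e. the effect is distinguishable
    from no change at roughly the 5% level."""
    return ci_low > 0 or ci_high < 0

pm25_ci = (-3.5, -1.1)   # Stockholm PM2.5, from the example above
ozone_ci = (-0.4, 0.6)   # hypothetical: ozone was not targeted by the regulation

# The pattern that strengthens the causal case: a change in the regulated
# pollutant, no detectable change in the untargeted one.
print("PM2.5 changed:", excludes_zero(*pm25_ci))   # True
print("Ozone changed:", excludes_zero(*ozone_ci))  # False
```

If both intervals excluded zero in the same direction, that would point toward a general air quality trend rather than a regulation effect, and the report should say so.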
Step 4: Prepare the technical documentation
The analysis must be reproducible. Another researcher at NERI should be able to open the project and replicate the results.
The reproducibility package is: the methodology memo (documenting every decision), the Jupyter notebook (with the analysis), and the CLAUDE.md file. That last one is new -- the project memory file is part of the reproducibility infrastructure. It tells the next analyst what conventions to follow, what data quality issues are known, and what the analytical constraints are. Without it, the next analyst starts from scratch.
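One lightweight way to make "same results" checkable rather than aspirational is to record a checksum of the final results file in the methodology memo, so the next analyst can verify their rerun byte for byte. A sketch; the CSV contents and the helper name are illustrative.

```python
import hashlib
from pathlib import Path

def results_checksum(path: str) -> str:
    """SHA-256 of the results file; record this in the methodology memo
    so a rerun can be verified against the original output."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

# Example with an in-memory stand-in for the results CSV:
results = b"city,estimate,ci_low,ci_high\nStockholm,-2.3,-3.5,-1.1\n"
digest = hashlib.sha256(results).hexdigest()
print(digest[:16])  # the first 16 hex characters are enough for the memo
```

A mismatch does not always mean an error -- floating-point results can differ across library versions -- but it tells the next analyst to look before trusting the rerun, which is the point.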
Step 5: Send the deliverable to Astrid
Send Astrid the findings report, the technical documentation, and a covering message. The message should summarize the key findings in two or three sentences and note where she can find the detailed methodology.
Astrid takes longer than usual to respond. When she does: "This is well done." That is high praise from Astrid. She notes the comparison pollutant analysis if you included it -- or asks about it if you did not.
Check: Findings translated to policymaker language. All five Astrid requirements addressed. Confidence intervals stated as actionable ranges. Limitations in report body, not appendix. Reproducibility package complete (methodology memo, notebook, CLAUDE.md).