Learn by Directing AI
Unit 6

Project close

Step 1: Review project output

Before closing, inventory what you have produced:

  • A data quality assessment with classified findings and business impact
  • Metric definitions for retention rate, revenue per member, and class attendance -- each with a plain-language description and a SQL definition
  • Automated validation tests for each metric
  • A Metabase dashboard with information hierarchy, accessibility, and durable design
  • Documentation of every design decision and trade-off

Check that all of these exist in your project folder and are committed.

Step 2: Write the README

Write a README for the project repository. Someone who has never seen this project should be able to:

  1. Understand what was built and for whom
  2. Start the dashboard (Docker and Metabase startup)
  3. Know what each metric means and how it is defined
  4. Find the quality assessment and metric definitions
  5. Run the validation tests

Include the retention rate definition explicitly -- this is the metric whose value changes the most depending on how you define it. Anyone maintaining this dashboard needs to know which definition you chose and why.
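A minimal README skeleton along these lines can serve as a starting point. The project name, the `docker compose` command, the port, and the folder names (`tests/`, `docs/`) are all assumptions -- adapt them to your actual repository layout:

```markdown
# Member Dashboard

Weekly KPI dashboard for the studio's leadership team.

## Quick start
1. `docker compose up -d`        <!-- assumes a compose file that starts Metabase -->
2. Open http://localhost:3000 and log in.

## Metrics
- **Retention rate** -- state the definition you chose and why you chose it
- **Revenue per member** -- adjusted for pricing tiers
- **Class attendance**

## Quality assessment
See `docs/` for the data quality findings and metric definitions.

## Validation tests
Run the tests in `tests/` before and after any data load.
```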

Step 3: Run final tests

Run all metric validation tests one final time. Every test should pass with the current data state. If any test flags something you have not investigated, investigate it now.

These tests are a living contract. When Diego's team adds data from the fifth location next year, the tests will flag any new data that violates the metric definitions. That is the point of automated governance -- it keeps working after you leave.
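As a sketch of what one such validation test can look like, the snippet below checks two invariants the retention metric depends on. It uses Python's built-in sqlite3 with an in-memory table as a portable stand-in for the project's DuckDB database; the table name `members` and its columns are assumptions, not the project's actual schema:

```python
import sqlite3

def check_retention_inputs(conn):
    """Validate invariants the retention metric depends on:
    no duplicate member ids, and no membership that ends before it starts."""
    duplicate_ids = conn.execute(
        "SELECT COUNT(*) FROM (SELECT member_id FROM members "
        "GROUP BY member_id HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    inverted_ranges = conn.execute(
        "SELECT COUNT(*) FROM members "
        "WHERE end_date IS NOT NULL AND end_date < start_date"
    ).fetchone()[0]
    return {"duplicate_ids": duplicate_ids, "inverted_ranges": inverted_ranges}

# Build a tiny sample table; ISO-8601 date strings compare correctly as text.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (member_id INTEGER, start_date TEXT, end_date TEXT)")
conn.executemany(
    "INSERT INTO members VALUES (?, ?, ?)",
    [(1, "2024-01-05", None), (2, "2024-02-01", "2024-06-30"), (3, "2024-03-10", None)],
)
result = check_retention_inputs(conn)
assert result == {"duplicate_ids": 0, "inverted_ranges": 0}
print("all metric input checks passed")
```

A test like this is cheap to run on every data load, which is what lets it catch problems long after the project is handed off.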

Step 4: Commit and push

Commit all remaining changes and push to GitHub. Use a commit message that describes what was delivered.

Step 5: Final email to Diego

Send Diego a final email summarizing what was delivered:

  • The dashboard: where to find it, how to start it, what the primary KPI is
  • The metric definitions: retention rate (with the definition you chose and why), revenue per member (adjusted for pricing tiers), class attendance
  • The quality assessment: what was found, what was fixed, what to watch for
  • Recommendations for the fifth location: what will need updating, what the validation tests will catch automatically

Write in Diego's terms, not technical terms. He does not need to know about DuckDB or SQL mode. He needs to know that his Monday morning dashboard is ready and what the numbers mean.

✓ Check

Check: Does your README explain how someone who has never seen this project would start the dashboard and understand what each metric means?

Project complete

Nice work. Ready for the next one?