Case studies

ContentRX, applied to real OSS projects

Each case study is a real run of ContentRX against a public project's UI copy, with the maintainers' written approval. The studies focus on the judgment calls a generic linter would miss: error messages that blame the user, permission buttons that should be specific verbs, empty states that aren't encouraging.

This page stays honest about its state. When no study has shipped yet, it says so. When studies ship, they land here with per-finding critique and links to any resulting PRs on the project's repo.

Published case studies

None yet. The scaffolding for case studies (sessions 26–28) shipped first; the three planned studies will be published as maintainer approvals come in. See the shortlist below for the projects under consideration.

Candidate shortlist

The shortlist below is the output of tools/case_study_candidates.py, which scores every repo in external_signal/allow_list.json by the count of quality signals (content-designer presence, active i18n, content-design blog) and license permissiveness. Surfacing the shortlist publicly is part of the transparency commitment — anyone can see which projects are being considered and object before outreach starts.

Shortlist generated.

  1. supabase/supabase Actively-maintained docs with content-designer review.
    Signals: content designer · active i18n · content-design blog · license Apache-2.0 · score 1
  2. vercel/next.js Vercel's flagship repo with active content review.
    Signals: content designer · active i18n · content-design blog · license MIT · score 1
  3. mdn/content Mozilla Developer Network — documentation-first content.
    Signals: content designer · active i18n · content-design blog · license CC-BY-SA-2.5 · score 0.8367
  4. linear/linear Linear OSS artifacts — content reputation.
    Signals: content designer · content-design blog · license MIT · score 0.8165
  5. PostHog/posthog Product-analytics UI with strong content practice.
    Signals: content designer · content-design blog · license MIT · score 0.8165
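The scoring above can be sketched as follows. This is a hypothetical reconstruction, not the real `tools/case_study_candidates.py`: the signal names, license weights, and allow-list schema are all assumptions made for illustration.

```python
import json

# Assumed signal keys in external_signal/allow_list.json entries (hypothetical).
SIGNALS = ("content_designer", "active_i18n", "content_design_blog")

# Assumed license permissiveness weights (hypothetical values).
LICENSE_WEIGHT = {"MIT": 1.0, "Apache-2.0": 1.0, "CC-BY-SA-2.5": 0.8}

def score(repo: dict) -> float:
    """Fraction of quality signals present, scaled by license permissiveness."""
    signal_score = sum(1 for s in SIGNALS if repo.get(s)) / len(SIGNALS)
    return round(signal_score * LICENSE_WEIGHT.get(repo.get("license"), 0.5), 4)

def shortlist(allow_list_path: str, top_n: int = 5) -> list[tuple[str, float]]:
    """Rank every repo in the allow list by score, highest first."""
    with open(allow_list_path) as f:
        repos = json.load(f)
    ranked = sorted(((r["name"], score(r)) for r in repos), key=lambda t: -t[1])
    return ranked[:top_n]
```

A repo with all three signals and an MIT license would score 1.0 under these assumed weights; fewer signals or a less permissive license pulls the score down.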

How case studies get published

The publishing workflow is manual, run on Robo's side, not automated:

  1. Pick a candidate from the shortlist.
  2. Contact the maintainers via the repo's issue tracker or a public channel. Get explicit written approval before any evaluation runs against their strings.
  3. Run ContentRX against the project and draft the MDX at docs-site/app/case-studies/<slug>/page.mdx using the template at docs-site/content/case-studies/_template.mdx.
  4. Add an entry to the CASE_STUDIES registry with maintainer_approval: true, approved_by, and approved_at. The CI guard in scripts/check_case_study_approval.py rejects any registry entry missing those fields.
  5. Open the PR. Three judgment-calls minimum per study.
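The CI guard in step 4 can be sketched as a simple field check. This is an illustrative sketch, not the real `scripts/check_case_study_approval.py`; the entry field names beyond the three the text names (e.g. `slug`) are assumptions.

```python
REQUIRED_APPROVAL_FIELDS = ("approved_by", "approved_at")

def check_entry(entry: dict) -> list[str]:
    """Return one error string per missing approval field in a registry entry.

    An empty list means the entry passes the guard.
    """
    slug = entry.get("slug", "?")  # 'slug' key is an assumed identifier
    errors = []
    if entry.get("maintainer_approval") is not True:
        errors.append(f"{slug}: maintainer_approval must be true")
    for field in REQUIRED_APPROVAL_FIELDS:
        if not entry.get(field):
            errors.append(f"{slug}: missing {field}")
    return errors

def check_registry(entries: list[dict]) -> list[str]:
    """Collect errors across the whole CASE_STUDIES registry."""
    return [err for entry in entries for err in check_entry(entry)]
```

In CI, a non-empty error list would fail the build, so a registry entry without explicit written approval can never publish a study.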
