
Auto-Generating Docs with Cursor Agent

Your open-source library just hit 10,000 stars on GitHub. The most upvoted issue? “Please add documentation.” The README has a single code example from two years ago, there are no API docs, the architecture is entirely in the lead developer’s head, and a major v2.0 release ships next week with 40 breaking changes. You need comprehensive docs — API reference, getting started guide, migration guide, architecture overview — and you need them before the release.

Writing documentation from scratch is one of the most tedious tasks in software development. Reading documentation that someone else should have written is one of the most frustrating. Cursor bridges this gap by reading your codebase and generating accurate, structured documentation that you can then refine with human judgment. The AI handles the mechanical extraction — function signatures, parameter types, return values, example generation — while you handle the editorial work of making it clear and useful.

By the end of this guide, you will have:

  • A codebase analysis prompt that identifies every public API and generates JSDoc/TSDoc annotations
  • An architecture documentation prompt that produces Mermaid diagrams from code structure
  • A getting-started guide generator that creates step-by-step tutorials from working tests
  • A migration guide prompt that diffs two versions and generates a breaking changes document
  • A documentation freshness workflow that detects when docs fall out of sync with code

Step 1: Run a documentation gap analysis

Before generating anything, take stock of what exists and what is missing. Use Ask mode for a gap analysis.

Step 2: Generate JSDoc annotations for the public API


The fastest documentation win is adding inline documentation to your source code. Agent mode can read function implementations and generate accurate JSDoc comments.
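As a sketch of what a generated annotation looks like, consider a hypothetical `retry` helper (the function and its options are illustrative, not from any particular library):

```javascript
/**
 * Retries an async operation with exponential backoff.
 *
 * @param {() => Promise<any>} fn - The operation to retry.
 * @param {object} [options] - Retry configuration.
 * @param {number} [options.attempts=3] - Maximum number of attempts.
 * @param {number} [options.baseDelayMs=100] - Delay before the first retry; doubles each attempt.
 * @returns {Promise<any>} Resolves with fn's result, or rejects with the last error.
 * @example
 * const data = await retry(() => fetch('/api/data').then((r) => r.json()));
 */
async function retry(fn, { attempts = 3, baseDelayMs = 100 } = {}) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Wait before the next attempt: 100ms, 200ms, 400ms, ...
      await new Promise((res) => setTimeout(res, baseDelayMs * 2 ** i));
    }
  }
  throw lastError;
}
```

The AI can extract the parameter names, defaults, and return behavior from the implementation; the one-line summary and `@example` are where you should spend your review time.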

Step 3: Generate architecture documentation


Architecture docs rot faster than any other type of documentation because they require understanding the system as a whole, not just individual functions. The trick is generating them from the code itself so they can be regenerated when the architecture changes.

```
@src
Generate architecture documentation at docs/architecture.md:
1. System Overview: A Mermaid diagram showing the major modules and their dependencies
2. Data Flow: How data moves through the system from input to output
3. Key Abstractions: The 5 most important interfaces/types and what they represent
4. Extension Points: Where users can customize or extend the library (plugins, hooks, callbacks)
5. Design Decisions: Why the code is structured this way (infer from the patterns -- why does it use a pipeline architecture? why are there separate parse and transform phases?)
6. Performance Characteristics: Big-O complexity of key operations, memory usage patterns
Use Mermaid for all diagrams. Keep the document under 1000 words -- architecture docs that
nobody reads are worse than no architecture docs.
```

Step 4: Create a getting started guide from tests


Your test suite is a gold mine of working code examples. Agent can extract them and turn them into a tutorial.
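As a sketch of the transformation, assume a hypothetical `slugify` utility. The assertion-style test at the top becomes the narrative snippet at the bottom:

```javascript
// Minimal stand-in for the library under test (hypothetical).
function slugify(title) {
  return title
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9]+/g, '-') // collapse punctuation/whitespace runs into '-'
    .replace(/^-|-$/g, '');      // trim leading/trailing dashes
}

// --- What the test suite contains ---
// test('slugify strips punctuation and lowercases', () => {
//   expect(slugify('Hello, World!')).toBe('hello-world');
// });

// --- What the getting-started guide says instead ---
// "Pass any title to slugify() to get a URL-safe slug:"
const slug = slugify('Hello, World!');
console.log(slug); // hello-world
```

The expectations in the test become the "you should see" callouts in the tutorial, which is exactly why test-derived examples stay runnable.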

Step 5: Generate a migration guide for breaking changes


For major version releases, a migration guide is the difference between users upgrading smoothly and users staying on the old version forever.

```
@git
Generate a v1-to-v2 migration guide at docs/migration-v2.md:
1. Run `git diff v1.0.0..HEAD -- src/` to see all code changes
2. Identify every breaking change:
   - Renamed functions or parameters
   - Changed return types
   - Removed APIs
   - Changed default values
   - New required parameters
3. For each breaking change, show:
   - What the v1 code looked like
   - What the v2 code should look like
   - A one-line explanation of why the change was made
4. Group changes by category: API changes, type changes, behavior changes, removed features
5. Add a "Quick Migration" section with find-and-replace patterns for the most common changes
6. Add a "Codemods" section if any changes can be automated
Order from most common to least common -- the change that affects the most users goes first.
```
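A typical entry in the generated guide pairs the old call with the new one. Here is an illustrative sketch for a hypothetical `parse` function whose boolean flag moved into an options object (the API and regex are examples, not from any real library):

```javascript
// Hypothetical v2 API: options moved into a single config object.
function parse(input, { strict = false } = {}) {
  return { input, strict };
}

// v1 (no longer works):
//   parse(source, true);  // second positional arg was `strict`
// v2 equivalent:
const result = parse('source', { strict: true });

// Quick-migration pattern for the most common case:
//   find:    parse\((\w+),\s*true\)
//   replace: parse($1, { strict: true })
console.log(result.strict); // true
```

Pairing every before/after with a one-line rationale is what makes users trust the guide enough to upgrade.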

Step 6: Set up documentation freshness monitoring


Documentation that falls out of sync with code is worse than no documentation — it actively misleads. Set up automated checks.
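One possible shape for such a check, sketched here with regexes (a real implementation would parse the AST, for example with the TypeScript compiler API): compare two versions of a file and flag any function whose signature changed while the JSDoc above it stayed identical.

```javascript
// Extract `/** ... */ function name(params)` pairs from a source string.
// Assumes simple function declarations; a sketch, not a full parser.
function extractDocs(source) {
  const re = /(\/\*\*[\s\S]*?\*\/)\s*(?:export\s+)?function\s+(\w+)\s*\(([^)]*)\)/g;
  const map = new Map();
  let m;
  while ((m = re.exec(source)) !== null) {
    map.set(m[2], { doc: m[1], signature: m[3].trim() });
  }
  return map;
}

// Report functions whose signature changed but whose doc comment did not.
function findStaleDocs(oldSource, newSource) {
  const oldDocs = extractDocs(oldSource);
  const stale = [];
  for (const [name, entry] of extractDocs(newSource)) {
    const prev = oldDocs.get(name);
    if (prev && prev.signature !== entry.signature && prev.doc === entry.doc) {
      stale.push(name); // signature changed, doc comment did not
    }
  }
  return stale;
}
```

Wired into a pre-commit hook (old source from `git show HEAD:file`, new source from the working tree), this catches the most common form of drift before it lands.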

Cursor can generate Mermaid diagrams that render directly in GitHub, GitLab, and most documentation tools. For complex systems, break diagrams into focused views.

```
@src
Generate these Mermaid diagrams for the architecture docs:
1. Module Dependency Graph: Show which modules import from which other modules.
   Use subgraphs for logical groupings (core, plugins, utils).
2. Request Lifecycle: A sequence diagram showing how a typical request flows
   through the system from entry point to response.
3. State Machine: If there are any state transitions (connection states, transaction
   states, etc.), generate a state diagram.
4. Class Hierarchy: Show inheritance and composition relationships between
   the key classes/interfaces.
Output as fenced mermaid code blocks that I can paste directly into markdown files.
```

Generated docs describe what the code does, not why. The AI can read the implementation and explain the mechanics, but it cannot know the business context. Always review generated docs and add the “why” yourself — why does this function exist? What problem does it solve? When should a user reach for this instead of that?

Code examples do not actually run. The AI generates examples that look correct but have import errors, missing setup code, or assume context that is not present. The documentation CI check from Step 6 catches these, but during initial generation, always test the first few examples manually.
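A minimal sketch of such a CI check, assuming examples live in fenced js blocks inside markdown (a real job would also provide shared setup and stub network calls):

```javascript
// Pull every fenced js code block out of a markdown string.
// The fence is matched as `{3} to avoid hard-coding a literal fence here.
function extractJsBlocks(markdown) {
  const blocks = [];
  const re = /`{3}js\n([\s\S]*?)`{3}/g;
  let m;
  while ((m = re.exec(markdown)) !== null) blocks.push(m[1]);
  return blocks;
}

// Run each block in isolation and collect the ones that throw.
function runDocExamples(markdown) {
  const failures = [];
  extractJsBlocks(markdown).forEach((code, i) => {
    try {
      new Function(code)(); // executes the example in its own scope
    } catch (err) {
      failures.push({ block: i, error: err.message });
    }
  });
  return failures;
}
```

Even this crude version catches the classic failure modes: a renamed function, a missing variable, or a snippet that silently assumed setup code from an earlier section.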

Architecture diagrams become unreadable. For large codebases, a full dependency graph is an unreadable hairball. Tell the AI to focus on a specific subsystem or limit the diagram to top-level modules. “Show me the top 10 most-imported modules and their relationships” is more useful than “show me everything.”
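To produce that focused view yourself, a small script can rank modules by how often they are imported before you ask for the diagram. This is a regex sketch (hypothetical helper; a real tool would build a proper module graph):

```javascript
// Given a map of filename -> source, count how often each module
// specifier is imported and return the top N "hub" modules.
function topImportedModules(files, n = 10) {
  const counts = new Map();
  // Matches both `from './mod'` and `require('./mod')` specifiers.
  const re = /(?:from\s+|require\()\s*['"]([^'"]+)['"]/g;
  for (const source of Object.values(files)) {
    let m;
    while ((m = re.exec(source)) !== null) {
      counts.set(m[1], (counts.get(m[1]) || 0) + 1);
    }
  }
  return [...counts.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, n)
    .map(([mod, count]) => ({ mod, count }));
}
```

Feed the resulting top-10 list back into the diagram prompt and the hairball collapses into a readable hub-and-spoke view.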

Migration guide misses subtle behavior changes. The AI detects API signature changes reliably but struggles with behavior changes where the function signature stays the same but the output changes. Supplement the automated migration guide with a manually written “Behavior Changes” section for anything you know changed but the code does not explicitly show.

Documentation generation bloats the codebase. Generated docs can be verbose. Set a word budget in your prompts (“Keep under 500 words”) and edit aggressively after generation. A concise doc that people actually read is infinitely more valuable than a comprehensive doc that nobody opens.

JSDoc annotations get stale. Inline documentation rots just like external docs. The pre-commit hook from Step 6 detects when function signatures change without documentation updates, but it cannot detect when the behavior changes while the signature stays the same. Periodic re-generation (monthly or per-release) catches drift.