Use Snag Docs with Your AI Coding Assistant

Supercharge your development workflow by connecting your favorite AI coding assistant, such as Cursor, Windsurf, or Claude Code, to Snag Docs. There are two main ways to do this:

1. Connect via Model Context Protocol (MCP)

The Model Context Protocol (MCP) lets you connect AI tools directly to Snag Docs for real-time, context-aware answers and code suggestions.

Step 1: Generate your MCP server

Run the following command to generate an MCP server from Snag Docs:

npx mint-mcp add snag

This sets up an MCP server that your AI tools can automatically detect and use.
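Most editors detected during setup will pick up the server automatically. If yours requires manual registration, MCP servers are usually listed in a JSON configuration file (for example, Cursor's mcp.json or Claude Desktop's config file). The entry below is only a sketch of that standard shape; the server name is illustrative, and the command and arguments are placeholders for whatever the mint-mcp setup prints for your environment:

{
  "mcpServers": {
    "snag-docs": {
      "command": "<command printed by the mint-mcp setup>",
      "args": ["<arguments printed by the mint-mcp setup>"]
    }
  }
}

You may need to restart your editor so it reloads its list of MCP servers.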

Step 2: Start using it with your AI assistant

Once the server is running, open your AI coding assistant. It should automatically detect and integrate with the MCP server for Snag Docs.

You can now ask questions and get context-aware code suggestions powered by your own documentation.
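For example, you might prompt your assistant with something like:

How do I set up a loyalty program in Snag? Check the Snag Docs MCP server before answering.

The assistant can then query the MCP server and ground its answer in the documentation.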

2. Index Snag Docs via llms.txt or llms-full.txt

AI tools can index your documentation for smarter, more relevant answers by using one of two special files:

  • llms.txt – A machine-readable map of your documentation, optimized for LLMs.
  • llms-full.txt – A single file containing all your documentation content, ideal for bulk ingestion.
Step 1: Point your AI tool to `llms.txt` or `llms-full.txt`

Many AI editors allow you to provide a link to documentation for indexing. Point your editor at the Snag Docs `llms.txt` file, or `llms-full.txt` if your tool ingests the full content in one pass.

Some editors (like Cursor) automatically index documentation you add in settings. Others may require referencing the docs in the prompt using an @ symbol or similar. For example:

@docs What is the loyalty program setup process?

Check your tool’s documentation for exact syntax.
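If your editor expects a URL rather than a pasted file, both files are typically served from the root of the documentation site. You can preview them with curl; the domain below is a placeholder, so substitute the actual Snag Docs address:

curl https://your-snag-docs-domain.example/llms.txt
curl https://your-snag-docs-domain.example/llms-full.txt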

Step 2: Verify indexing and start coding

After linking your documentation, your AI assistant will use Snag Docs as a knowledge base for completions, suggestions, and answers.

You’ll get smarter, context-aware help based on the latest Snag Docs content.


By connecting your AI coding assistant to Snag Docs, you unlock smarter, faster, and more accurate development support—right where you need it.