Context7 MCP Server

Context7 MCP Server gives LLMs access to up-to-date, version-specific documentation and code examples for a wide range of libraries and packages, injecting that information directly into your prompts to eliminate outdated code, hallucinated APIs, and generic answers.

Author: upstash


What is Context7 MCP Server?

Context7 MCP Server is an open-source MCP server that acts as a dynamic documentation and code example bridge for LLMs and AI development tools. By connecting via the Model Context Protocol (MCP), it fetches and delivers current, authoritative docs and practical examples for libraries or frameworks used in your prompts, keeping coding answers accurate and relevant. It is compatible with popular editors and tools like Cursor, Claude Desktop, Windsurf, Zed, and more.

How to Configure Context7 MCP Server

You can configure Context7 MCP Server by adding it to your tool’s MCP configuration. Common approaches are:

  • Node.js: Use npx (or bunx/deno) to run the latest package by updating your MCP client config:
    {
      "mcpServers": {
        "context7": {
          "command": "npx",
          "args": ["-y", "@upstash/context7-mcp@latest"]
        }
      }
    }
    
  • Editors/IDEs: Go to settings (e.g. Cursor or VS Code) and add Context7 as a global MCP server using the recommended config, or install via the appropriate extension marketplace.
  • Docker: Build and run the provided Dockerfile, then point your MCP client config at the docker run command (see the sketch after this list).
  • Smithery: Use npx -y @smithery/cli install @upstash/context7-mcp --client claude for Claude Desktop integration.
  • Ensure Node.js v18+ for best compatibility, and consult your tool’s MCP docs for location-specific config details.
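
A Docker-based entry follows the same shape as the Node.js config above. This is only a sketch: the image tag (context7-mcp here) is an assumption, so substitute whatever tag you used when building the provided Dockerfile.

    {
      "mcpServers": {
        "context7": {
          "command": "docker",
          "args": ["run", "-i", "--rm", "context7-mcp"]
        }
      }
    }

Here -i keeps stdin open so the MCP client can talk to the server over stdio, and --rm cleans up the container when the session ends.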

How to Use Context7 MCP Server

  1. Enable Context7 in your prompt: As you write a coding prompt, add use context7 at the end of your instruction (e.g., Create a basic Next.js project with app router. use context7).
  2. LLM Integration: The LLM or AI assistant detects the use context7 directive and automatically queries Context7 MCP for relevant, up-to-date documentation and code samples.
  3. Prompt Enhancement: The fetched documentation is injected into the prompt, grounding the LLM's output and avoiding outdated APIs and hallucinated responses.
  4. No Tab-Switching Needed: Everything happens within the context of your development chat, IDE, or coding assistant.

Key Features

  • Latest Documentation: Pulls official, up-to-date docs and code samples for thousands of libraries.
  • Multi-Tool Support: Seamlessly integrates with Cursor, Claude Desktop, VS Code, Zed, and any MCP-compatible agent or IDE.
  • Plug-and-Play Setup: Simple config—works via npx, bunx, deno, Docker, or marketplace extensions.
  • Dynamic Context Injection: Delivers contextually-relevant docs directly into LLM prompts.
  • Tools API: Exposes executable actions (like resolve-library-id and get-library-docs) through the MCP protocol; see the sketch after this list.
  • Open & Extensible: Community-driven, MIT licensed, and easily adaptable to new environments or workflows.
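
To make the Tools API concrete, here is a rough sketch of the two JSON-RPC requests an MCP client might send on your behalf. The argument names and the library ID are illustrative assumptions; the real schemas are advertised by the server at runtime.

    {
      "jsonrpc": "2.0",
      "id": 1,
      "method": "tools/call",
      "params": {
        "name": "resolve-library-id",
        "arguments": { "libraryName": "next.js" }
      }
    }

    {
      "jsonrpc": "2.0",
      "id": 2,
      "method": "tools/call",
      "params": {
        "name": "get-library-docs",
        "arguments": { "context7CompatibleLibraryID": "/vercel/next.js", "topic": "app router" }
      }
    }

The first call maps a free-form library name to an identifier the server understands; the second fetches focused documentation for that identifier, which the client then injects into the prompt.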

Use Cases

  • AI Pair Programming: Boost LLM code completions with current examples and version-matched APIs.
  • Error Debugging: Instantly pull in relevant troubleshooting steps and documentation for resolving coding issues.
  • Learning New Frameworks: Query best-practice code and guides for unfamiliar libraries and packages.
  • Legacy Migration: Get accurate syntax and API changes for package upgrades, minimizing manual research.

FAQ

Q1: What should I do if I get an ERR_MODULE_NOT_FOUND error when starting Context7 MCP Server?
Try using bunx instead of npx in your configuration. This often resolves module resolution issues, especially in environments where npx does not work as expected or Node ESM handling causes trouble.
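
A minimal sketch of the bunx variant, assuming Bun is installed (bunx does not need npx's -y flag):

    {
      "mcpServers": {
        "context7": {
          "command": "bunx",
          "args": ["@upstash/context7-mcp@latest"]
        }
      }
    }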

Q2: Can I run Context7 MCP Server in a Docker container?
Yes! Build a Docker image using the provided Dockerfile, then update your MCP client configuration to execute the Docker run command as shown in the guide. Make sure the Docker daemon is running and the image tag matches your config.

Q3: How do I use Context7 from different editors like VS Code, Cursor, or Zed?
There are ready-to-use configuration snippets for each major editor and tool. Either use the extension/marketplace, or manually add the correct server definition to your MCP config file as demonstrated above.
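
As a hedged example, VS Code reads MCP servers from an mcp.json file whose schema differs slightly from the mcpServers block shown earlier; treat this as a sketch and confirm the exact file location and keys in your editor's MCP documentation.

    {
      "servers": {
        "context7": {
          "type": "stdio",
          "command": "npx",
          "args": ["-y", "@upstash/context7-mcp@latest"]
        }
      }
    }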

Q4: How do I ensure that the documentation provided is accurate and up-to-date?
Context7 sources docs and examples directly from the latest available official sources and indexed repositories. However, as a community-driven project, always double-check content—use the "Report" feature if you spot inaccuracies.

Q5: The LLM is generating generic or out-of-date code—what might be wrong?
Make sure use context7 appears in your prompt, and confirm your MCP configuration is working (test via the MCP Inspector). Also check your network/firewall settings if running locally or via Docker.
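
To sanity-check the server outside your editor, the MCP Inspector can launch it directly. The invocation below is a sketch and assumes the standard @modelcontextprotocol/inspector package; if the Inspector starts and lists the Context7 tools, the server side is healthy.

    npx -y @modelcontextprotocol/inspector npx @upstash/context7-mcp@latest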