By mastra-ai · Created 13 days ago

Client implementation for Mastra, integrating MCP-compatible AI models and tools.


Category

Official MCP Server

Tags

AI Framework · TypeScript · LLM Integration

What is Mastra MCP?

Mastra MCP is a client implementation for Mastra, a TypeScript framework for building AI applications and features quickly. It integrates MCP-compatible AI models and tools, giving developers agents, workflows, RAG, and evals in a single framework.

How to Use Mastra MCP?

  1. Prerequisites: Ensure you have Node.js (v20.0+).
  2. Get an LLM Provider API Key: Obtain API keys from providers like OpenAI, Anthropic, Google Gemini, Groq, or Cerebras.
  3. Create a New Project: Use the CLI tool npx create-mastra@latest to set up a new Mastra application.
  4. Run the Dev Server: Execute npx mastra dev to open the Mastra playground.
  5. Configure MCP Server: Set up the MCP server in Cursor or Windsurf by configuring mcp.json or mcp_config.json respectively.
  6. Enable MCP Server: Enable the MCP server in Cursor settings or Windsurf to access Mastra documentation tools.
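Steps 5 and 6 can be sketched as a minimal `mcp.json` entry for Cursor. This is an illustrative fragment; the `@mastra/mcp-docs-server` package name follows Mastra's documentation, but verify it against the current docs before use:

```json
{
  "mcpServers": {
    "mastra": {
      "command": "npx",
      "args": ["-y", "@mastra/mcp-docs-server"]
    }
  }
}
```

Windsurf uses the same shape in its `mcp_config.json`; after saving, enable the server in the editor's MCP settings.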

Key Features of Mastra MCP?

  • LLM Models: Uses Vercel AI SDK for model routing, supporting providers like OpenAI, Anthropic, and Google Gemini.
  • Agents: Systems where LLMs choose action sequences, with access to tools, workflows, and synced data.
  • Tools: Typed functions for agents or workflows, with integration access and parameter validation.
  • Workflows: Durable graph-based state machines with loops, branching, and error handling.
  • RAG: Retrieval-augmented generation for constructing knowledge bases.
  • Integrations: Auto-generated, type-safe API clients for third-party services.
  • Evals: Automated tests to evaluate LLM outputs using various methods.
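The "Tools" feature above describes typed functions with parameter validation. The sketch below illustrates that pattern in plain TypeScript; the names (`weatherTool`, `validateParams`) are hypothetical and this is not the real `@mastra/core` API, which defines tools with Zod schemas:

```typescript
// Illustrative sketch of the pattern Mastra tools follow: a typed function
// with runtime parameter validation. Names are hypothetical, not the real
// @mastra/core API (which uses Zod schemas for validation).

type ToolParams = { city: string; units: "metric" | "imperial" };

// A minimal hand-rolled validator standing in for schema validation.
function validateParams(input: unknown): ToolParams {
  const p = input as Partial<ToolParams>;
  if (typeof p.city !== "string" || p.city.length === 0) {
    throw new Error("city must be a non-empty string");
  }
  if (p.units !== "metric" && p.units !== "imperial") {
    throw new Error('units must be "metric" or "imperial"');
  }
  return { city: p.city, units: p.units };
}

// The tool itself: validated input in, typed result out.
// The temperature is stubbed; a real tool would call an integration.
function weatherTool(input: unknown): { city: string; tempC: number } {
  const { city } = validateParams(input);
  return { city, tempC: 21 };
}
```

Because parameters are validated before the tool body runs, an agent or workflow calling the tool gets a typed error instead of silently wrong output.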

Use Cases of Mastra MCP?

  1. Building AI-powered applications with integrated LLM models.
  2. Creating agents for specific tasks or workflows.
  3. Developing workflows for complex processes.
  4. Enhancing applications with RAG-based knowledge bases.
  5. Testing and evaluating LLM outputs with built-in evals.
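Use case 4 hinges on the retrieval step of RAG: score stored chunks against a query and return the best matches to include in an LLM prompt. The sketch below uses plain keyword overlap as a stand-in for the vector-embedding similarity a real Mastra RAG pipeline would use; all names are illustrative:

```typescript
// Illustrative RAG retrieval step: rank knowledge-base chunks by relevance
// to a query. Keyword overlap stands in for embedding similarity here.

const knowledgeBase = [
  "Mastra agents can call typed tools.",
  "Workflows are graph-based state machines.",
  "RAG builds knowledge bases for retrieval.",
];

function tokenize(text: string): Set<string> {
  return new Set(text.toLowerCase().match(/[a-z]+/g) ?? []);
}

// Return the topK chunks with the largest token overlap with the query.
function retrieve(query: string, topK = 1): string[] {
  const q = tokenize(query);
  return knowledgeBase
    .map((chunk) => ({
      chunk,
      overlap: [...tokenize(chunk)].filter((t) => q.has(t)).length,
    }))
    .sort((a, b) => b.overlap - a.overlap)
    .slice(0, topK)
    .map((r) => r.chunk);
}
```

The retrieved chunks would then be appended to the prompt so the LLM answers from the knowledge base rather than from its training data alone.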

FAQ about Mastra MCP

  • How to get API keys for LLM providers?

Obtain API keys from providers like OpenAI, Anthropic, Google Gemini, etc., by signing up and following their documentation.

  • What are the prerequisites for using Mastra MCP?

Node.js (v20.0+) and API keys from supported LLM providers.

  • What is the Model Context Protocol (MCP) server?

The MCP server provides AI assistants with direct access to Mastra's complete knowledge base.

  • How to contribute to Mastra MCP?

Contributions are welcome. Open an issue to discuss before submitting a Pull Request.

  • Where to find support for Mastra MCP?

Join the open community Discord or leave a star on the GitHub project.