AI agents need a way to call your APIs. The Model Context Protocol (MCP) gives them exactly that -- a standardized interface for discovering and invoking API operations as tools. But building an MCP server from scratch means writing request handlers, mapping endpoints to tool definitions, managing authentication, and hosting it all somewhere reliable.
With Zuplo, you can skip all of that. Drop in your OpenAPI spec and Zuplo's MCP Server Handler automatically exposes your API endpoints as MCP tools. No custom code. No infrastructure to manage. In this tutorial, you'll go from an OpenAPI spec to a deployed, secure MCP server in under five minutes.
Prerequisites
Before you start, make sure you have:
- An OpenAPI spec (v3.x) -- A valid OpenAPI 3.0 or 3.1 document describing your API endpoints. If you don't have one yet, we'll provide an example below.
- A Zuplo account -- The free tier works for this tutorial. Sign up here if you haven't already.
- Node.js installed -- Required for the Zuplo CLI. Version 18 or later is recommended.
Step 1: Create a Zuplo Project
Start by creating a new Zuplo project. You can do this from the Zuplo Portal dashboard or from the command line using the CLI:
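From the command line, a scaffold command along these lines creates the project. The exact package name and flags are an assumption here and may differ between CLI versions, so check the Zuplo CLI docs if the command isn't recognized:

```shell
# Scaffold a new Zuplo project (package name is an assumption --
# consult the Zuplo CLI docs for the current command)
npx create-zuplo-api@latest my-mcp-server
cd my-mcp-server
```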
The CLI scaffolds a project with the standard Zuplo structure, including a
config/routes.oas.json file where your API routes are defined and a
config/zuplo.jsonc configuration file.
If you prefer the portal, click New Project, give it a name, and you'll land in the Route Designer where you can configure everything visually.
Step 2: Add Your OpenAPI Spec
Your OpenAPI spec is the foundation of the MCP server. Zuplo reads it to understand your API's endpoints, parameters, request bodies, and descriptions -- then maps each operation to an MCP tool automatically.
Replace the contents of config/routes.oas.json with your own OpenAPI document.
If you want to follow along with an example, here's a simple todo API spec with
three endpoints:
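The following sketch shows what such a spec might look like. The backend URL is a placeholder, and the `x-zuplo-route` blocks follow the shape of Zuplo's route configuration (handler export and module names here are assumptions, so compare against a freshly scaffolded project):

```json
{
  "openapi": "3.1.0",
  "info": { "title": "Todo API", "version": "1.0.0" },
  "paths": {
    "/todos": {
      "get": {
        "operationId": "listTodos",
        "summary": "List all todos",
        "description": "Returns every todo item.",
        "x-zuplo-route": {
          "handler": {
            "export": "urlForwardHandler",
            "module": "$import(@zuplo/runtime)",
            "options": { "baseUrl": "https://todo-backend.example.com" }
          }
        },
        "responses": { "200": { "description": "A list of todos" } }
      },
      "post": {
        "operationId": "createTodo",
        "summary": "Create a todo",
        "description": "Creates a new todo item from a title.",
        "requestBody": {
          "content": {
            "application/json": {
              "schema": {
                "type": "object",
                "properties": {
                  "title": {
                    "type": "string",
                    "description": "Short label for the todo"
                  }
                },
                "required": ["title"]
              }
            }
          }
        },
        "x-zuplo-route": {
          "handler": {
            "export": "urlForwardHandler",
            "module": "$import(@zuplo/runtime)",
            "options": { "baseUrl": "https://todo-backend.example.com" }
          }
        },
        "responses": { "201": { "description": "The created todo" } }
      }
    },
    "/todos/{todoId}": {
      "delete": {
        "operationId": "deleteTodo",
        "summary": "Delete a todo",
        "description": "Deletes the todo with the given ID.",
        "parameters": [
          {
            "name": "todoId",
            "in": "path",
            "required": true,
            "description": "ID of the todo to delete",
            "schema": { "type": "string" }
          }
        ],
        "x-zuplo-route": {
          "handler": {
            "export": "urlForwardHandler",
            "module": "$import(@zuplo/runtime)",
            "options": { "baseUrl": "https://todo-backend.example.com" }
          }
        },
        "responses": { "204": { "description": "Todo deleted" } }
      }
    }
  }
}
```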
Two things matter here for MCP tool quality:
- `operationId` -- Each operation needs a unique `operationId`. This becomes the tool name that AI agents see and call.
- `description` -- Write clear, concise descriptions for every operation and parameter. AI agents rely on these descriptions to understand when and how to use each tool.
Update the baseUrl values to point to your actual backend API. Zuplo acts as a
gateway, forwarding requests to your backend while adding the MCP layer on top.
Step 3: Enable the MCP Server Handler
Now create a second OpenAPI file for your MCP server endpoint. Add a new file at
config/mcp.oas.json:
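A minimal version of that file might look like the sketch below. The handler `export` name and option shape are assumptions based on Zuplo's handler conventions, so verify them against the MCP Server docs:

```json
{
  "openapi": "3.1.0",
  "info": { "title": "MCP Server", "version": "1.0.0" },
  "paths": {
    "/mcp": {
      "post": {
        "operationId": "mcpServer",
        "x-zuplo-route": {
          "handler": {
            "export": "mcpServerHandler",
            "module": "$import(@zuplo/runtime)",
            "options": {
              "name": "todo-mcp-server",
              "version": "1.0.0",
              "sourceRouteFile": "routes.oas.json"
            }
          }
        },
        "responses": { "200": { "description": "MCP protocol response" } }
      }
    }
  }
}
```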
The key configuration is in the handler options:
- `name` -- The display name of your MCP server, visible to AI clients.
- `version` -- The version of your MCP server.
- `sourceRouteFile` -- Points to your main OpenAPI file (`routes.oas.json`). Zuplo reads this file to generate MCP tool definitions from your API endpoints.
That's it. Save the file and Zuplo handles the rest -- parsing your OpenAPI
spec, generating tool schemas, and serving the MCP protocol at /mcp.
Step 4: Add Authentication
Before deploying, you should secure your MCP server so only authorized clients can access it. Zuplo makes this straightforward with inbound policies.
Add an API key authentication policy to your MCP endpoint by updating the
policies section in config/mcp.oas.json:
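The relevant fragment of the `/mcp` route might look like this, with `api-key-auth` as a policy name you choose yourself:

```json
"x-zuplo-route": {
  "handler": {
    "export": "mcpServerHandler",
    "module": "$import(@zuplo/runtime)",
    "options": {
      "name": "todo-mcp-server",
      "version": "1.0.0",
      "sourceRouteFile": "routes.oas.json"
    }
  },
  "policies": {
    "inbound": ["api-key-auth"]
  }
}
```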
Then define the policy in your config/policies.json file:
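A sketch of that definition follows. The `policyType` and `export` values are assumptions drawn from Zuplo's inbound-policy naming conventions, so confirm them against the policy reference docs:

```json
{
  "policies": [
    {
      "name": "api-key-auth",
      "policyType": "api-key-inbound",
      "handler": {
        "export": "ApiKeyInboundPolicy",
        "module": "$import(@zuplo/runtime)",
        "options": {
          "allowUnauthenticatedRequests": false
        }
      }
    }
  ]
}
```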
Once deployed, you can create and manage API keys from the Zuplo portal under
the API Key Consumers section. Each consumer gets a unique key that must be
included in the Authorization header of MCP requests.
This is especially important for MCP servers because AI agents will be making automated calls to your API. Without authentication, anyone who discovers your MCP endpoint could use it freely.
Step 5: Deploy
Deploy your project with a single command:
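Assuming the Zuplo CLI is available via `npx`, the deploy command looks roughly like this (run it from the project root; the CLI will prompt you to log in if needed):

```shell
# Deploy the project to Zuplo's edge network
# (exact subcommand per the Zuplo CLI docs)
npx zuplo deploy
```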
Zuplo deploys your API gateway and MCP server to its global edge network. Once the deployment completes, you'll see the URL of your live gateway -- something like:
Your MCP server is now live at
https://my-mcp-server-main-abc1234.zuplo.dev/mcp and ready to accept
connections from any MCP-compatible client.
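As a quick smoke test, you can list the generated tools with `curl`. This sketch assumes the API key policy from Step 4 and uses the example deployment URL; note that some MCP servers require an `initialize` exchange before other methods, so a rejection here doesn't necessarily mean the server is broken:

```shell
# List available MCP tools (replace the URL and key with your own)
curl -X POST https://my-mcp-server-main-abc1234.zuplo.dev/mcp \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
```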
If you're working in the Zuplo Portal instead of the CLI, click Save and the deployment happens automatically.
Step 6: Test with an MCP Client
With your MCP server deployed, connect to it from an MCP client to verify everything works. Here's how to set it up with a few popular clients.
Claude Desktop
Open your Claude Desktop configuration file and add your MCP server:
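Claude Desktop launches MCP servers as local processes, so connecting to a remote HTTP server typically goes through a bridge such as the `mcp-remote` package. A sketch of `claude_desktop_config.json` under that assumption:

```json
{
  "mcpServers": {
    "todo-api": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://my-mcp-server-main-abc1234.zuplo.dev/mcp",
        "--header",
        "Authorization: Bearer YOUR_API_KEY"
      ]
    }
  }
}
```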
Restart Claude Desktop and you should see your API tools listed in the tools menu. Try asking Claude to "list all todos" and watch it call your API through the MCP server.
Cursor
In Cursor, go to Settings > MCP and add a new server with the same URL and authorization header. Cursor's AI assistant will then be able to use your API tools when answering questions or writing code.
MCP Inspector
For debugging and testing, the MCP Inspector is an excellent tool. Point it at your MCP server URL and you can browse available tools, see their schemas, and invoke them manually to verify the request and response mapping.
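The Inspector ships as an npm package, so you can launch it without installing anything permanently:

```shell
# Launch the MCP Inspector in your browser, then enter your
# server URL and Authorization header in the connection form
npx @modelcontextprotocol/inspector
```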
What Happens Under the Hood
When you set up the MCP Server Handler, Zuplo does the following automatically:
- Parses your OpenAPI spec -- It reads `routes.oas.json` and extracts every operation that has an `operationId`.
- Generates MCP tool definitions -- Each operation becomes a tool. The `operationId` becomes the tool name. The `summary` and `description` fields become the tool's description that AI agents use to decide when to call it. Parameters and request body schemas are converted into the tool's input schema.
- Handles protocol negotiation -- The `/mcp` endpoint speaks the MCP protocol, handling the `initialize`, `tools/list`, and `tools/call` messages that clients send.
- Forwards requests to your backend -- When an AI agent calls a tool, Zuplo maps the tool invocation back to the corresponding HTTP request (method, path, parameters, body) and forwards it to your backend via the URL forward handler.
- Applies policies -- Any inbound policies you've configured (authentication, rate limiting, request validation) run before the request reaches your backend.
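To make the mapping concrete, here is what a generated tool definition might look like for a hypothetical `createTodo` operation, following the tool shape defined by the MCP specification (the exact output Zuplo produces may differ in detail):

```json
{
  "name": "createTodo",
  "description": "Create a todo. Creates a new todo item from a title.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "title": {
        "type": "string",
        "description": "Short label for the todo"
      }
    },
    "required": ["title"]
  }
}
```

The operation's `operationId` supplies the `name`, its `summary` and `description` are folded into the tool description, and the request body schema becomes `inputSchema`.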
The result is that your existing API becomes AI-accessible without changing a single line of your backend code. The OpenAPI spec you already maintain is the single source of truth for both human-facing documentation and AI-facing tool definitions.
Next Steps
You now have a working MCP server backed by your OpenAPI spec. Here are some ways to build on this foundation:
- Add rate limiting -- Protect your backend from aggressive AI agents by adding a rate limiting policy. This is critical in production since AI agents can generate high request volumes.
- Enable request validation -- Add the JSON schema validation policy to ensure AI agents send well-formed requests that match your OpenAPI schema.
- Add monitoring -- Use Zuplo's built-in analytics to track which tools AI agents call most frequently, monitor error rates, and understand usage patterns.
- Explore MCP prompts -- Go beyond tools by adding MCP prompts that guide AI agents through multi-step workflows with your API.
- Set up an MCP Gateway -- If you're managing multiple MCP servers across teams, Zuplo's MCP Gateway provides centralized governance, access control, and observability.
For the full documentation on Zuplo's MCP support, see the MCP Server docs.
Get Started
Zuplo's MCP Server Handler is available on all plans, including the free tier. If you already have an OpenAPI spec, you're five minutes away from a deployed MCP server.
Sign up for Zuplo and turn your API into an AI-ready tool today.