
title: "MCP starters"
description: Level 2 - Build MCP servers and integrate them with AI agents and MCP-compatible clients

MCP (Model Context Protocol) is an open standard that lets AI agents connect to external tools through a unified interface. Instead of writing custom integrations for every API, you build an MCP server that exposes your APIs as tools, and any MCP-compatible agent can discover and use them automatically. This guide covers both sides of the MCP integration:
| Guide | What it covers |
| --- | --- |
| MCP server | Build a pro-code MCP server with FastMCP, or expose APIs through Azure APIM using click-ops in the Azure Portal |
| MCP client | Connect to an MCP server from different clients: FastMCP, Cursor, Claude Desktop, and Agno agents |

MCP server

There are two ways to create an MCP server:

Pro-code with FastMCP

Build a custom MCP server in Python using the FastMCP framework, starting from an OpenAPI spec or handwritten tool functions.

Click-ops with Azure APIM

Expose existing APIs provisioned in Azure APIM as MCP servers directly from the Azure Portal without writing custom code.

Pro-code MCP server with FastMCP

Build a production-ready MCP server using the FastMCP Python framework. The starter-mcp-server repository provides a working reference implementation you can clone and extend.

GitHub repository

View source code, releases, and issues

Why FastMCP?

FastMCP is the recommended way to build MCP servers in Python. The official MCP Python SDK provides a low-level protocol implementation that requires manual handler registration, hand-crafted JSON Schema dictionaries, and transport boilerplate. FastMCP removes that complexity.

Minimal boilerplate

A working MCP server is 5 lines of code. Register any Python function as a tool with a single decorator.

Automatic schema generation

Type hints become JSON Schema, Python docstrings become tool descriptions, and you don’t maintain schemas manually.

OpenAPI to MCP

Auto-generate an MCP server from any OpenAPI specification, turning every REST endpoint into an MCP tool.

Multiple transports

STDIO, Streamable HTTP, and SSE with a single configuration change.

Prerequisites

Install the following on your workstation:
  • Python 3.12+: Managed via UV
  • UV Package Manager: Modern Python package manager that replaces pip and poetry

Quick start

Run the starter MCP server on your machine:

1. Clone and install UV

git clone https://github.com/bb-ecos-agbs/starter-mcp-server.git
cd starter-mcp-server

# Install UV (macOS)
brew install uv

# Install UV (Linux/WSL)
curl -LsSf https://astral.sh/uv/install.sh | sh

2. Set up environment

# Create virtual environment
uv venv --python 3.12
source .venv/bin/activate  # macOS/Linux
# Or .venv\Scripts\activate  # Windows

# Copy environment template
cp .env.example .env

3. Configure environment

Edit .env with your values:
# Base URL of the upstream API that this MCP server proxies
API_BASE_URL=https://api.escuelajs.co

# Server configuration
MCP_TRANSPORT=streamable-http  # "stdio" | "streamable-http"
MCP_SERVER_PORT=8557
MCP_SERVER_HOST=0.0.0.0
MCP_STATELESS_HTTP=true  # "true" = no session state between requests, "false" = stateful

# Optional: VPN and web proxy settings (httpx reads these automatically)
# HTTP_PROXY=http://webproxy.infra.backbase.cloud:8888
# HTTPS_PROXY=http://webproxy.infra.backbase.cloud:8888
# NO_PROXY=localhost,127.0.0.1

4. Install dependencies and run

uv sync

# Run the server
uv run python -m src.main
The MCP server runs at http://localhost:8557/mcp.
VPN and web proxy: if your upstream API is behind a corporate network, uncomment and configure the proxy settings in .env. For setup instructions, see the Onboarding guide on Confluence.

Project structure

The starter-mcp-server repository contains the following directories and files:
starter-mcp-server/
├── .github/                         # CI/CD workflows
│   └── workflows/
│       ├── pull-request-check.yaml
│       ├── build-publish.yaml
│       ├── release-draft.yaml
│       ├── release.yaml
│       └── repository-provisioning.yaml
├── src/
│   ├── main.py                      # Entry point
│   └── starter_mcp_server.py        # MCP server (tools, config, OpenAPI)
├── tests/
│   ├── test_config.py
│   ├── test_header_forwarding.py
│   ├── test_main.py
│   ├── test_server.py
│   └── test_tools.py
├── platzi-fake-store-api.yaml       # Sample OpenAPI spec
├── .env.example                     # Environment template
├── Dockerfile                       # Container definition
└── pyproject.toml                   # Dependencies

Create an MCP server

The following sections show patterns from a minimal server through OpenAPI generation, custom tools, routing, and deployment.

Minimal example

This example registers one tool and starts the server:
from fastmcp import FastMCP

mcp = FastMCP(name="MyServer")

@mcp.tool
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

if __name__ == "__main__":
    mcp.run()

Create tools with @mcp.tool

Decorate any Python function to expose it as a tool. FastMCP auto-generates the name, description, and input schema from the function signature.
from typing import Annotated
from pydantic import Field
from fastmcp import FastMCP
from fastmcp.exceptions import ToolError

mcp = FastMCP(name="CalculatorServer")

@mcp.tool(
    name="find_products",
    description="Search the product catalog with optional category filtering.",
    tags={"catalog", "search"},
    timeout=30.0,
    annotations={"readOnlyHint": True},
)
def search_products(query: str, category: str | None = None) -> list[dict]:
    return [{"id": 2, "name": "Widget"}]

@mcp.tool
def process_image(
    image_url: Annotated[str, "URL of the image to process"],
    width: Annotated[int, Field(description="Target width", ge=1, le=2000)] = 800,
) -> dict:
    """Process an image with optional resizing."""
    ...

@mcp.tool
def divide(a: float, b: float) -> float:
    """Divide a by b."""
    if b == 0:
        raise ToolError("Division by zero is not allowed.")
    return a / b
Supported type annotations:
| Type | Example |
| --- | --- |
| Basic types | int, float, str, bool |
| Collections | list[str], dict[str, int] |
| Optional | float \| None, Optional[float] |
| Constrained | Literal["A", "B"], Enum |
| Pydantic models | UserData |
ToolError messages are always sent to clients. When mask_error_details=True, FastMCP masks other exceptions from clients.

Create from an OpenAPI spec

FastMCP can auto-generate an MCP server from any OpenAPI specification. Every endpoint becomes a tool that forwards requests to the underlying API. The starter-mcp-server uses this approach with the Platzi Fake Store API.
import httpx
import yaml
from fastmcp import FastMCP

with open("my-api-spec.yaml") as f:
    openapi_spec = yaml.safe_load(f)

client = httpx.AsyncClient(base_url="https://api.example.com")

mcp = FastMCP.from_openapi(
    openapi_spec=openapi_spec,
    client=client,
    name="My API Server",
)

if __name__ == "__main__":
    mcp.run()

Integrate a REST API as a tool

Instead of auto-generating from an OpenAPI spec, wrap any API call as a hand-written tool function:
import httpx
from fastmcp import FastMCP

mcp = FastMCP("API Tools Server")

@mcp.tool
async def get_payment_orders(status: str | None = None) -> dict:
    """Retrieve payment orders, optionally filtered by status."""
    async with httpx.AsyncClient() as client:
        params = {}
        if status:
            params["status"] = status
        response = await client.get(
            "https://api.example.com/client-api/v3/payment-orders",
            params=params,
            headers={"Authorization": "Bearer TOKEN"},
        )
        response.raise_for_status()
        return response.json()

Include and exclude tools

Use RouteMap to control which endpoints your MCP server exposes:
from fastmcp import FastMCP
from fastmcp.server.openapi import RouteMap, MCPType

mcp = FastMCP.from_openapi(
    openapi_spec=spec,
    client=client,
    route_maps=[
        RouteMap(pattern=r"^/admin/.*", mcp_type=MCPType.EXCLUDE),
        RouteMap(tags={"internal"}, mcp_type=MCPType.EXCLUDE),
        RouteMap(methods=["GET"], pattern=r"^/.*", mcp_type=MCPType.TOOL),
        RouteMap(mcp_type=MCPType.EXCLUDE),
    ],
)
FastMCP evaluates route maps in order. The first match wins.
MCPType values:
| Value | Description |
| --- | --- |
| MCPType.TOOL | Expose as an MCP Tool |
| MCPType.RESOURCE | Expose as an MCP Resource |
| MCPType.RESOURCE_TEMPLATE | Expose as a Resource Template |
| MCPType.EXCLUDE | Exclude from the MCP server entirely |

Forward client headers

When your MCP server proxies an authenticated API, forward headers from the MCP client request to upstream API calls. The starter-mcp-server includes this pattern for forwarding Authorization headers:
from typing import Any
import httpx
from fastmcp.server.dependencies import get_http_request

class HeaderForwardingClient(httpx.AsyncClient):
    """Forwards auth headers from the MCP client request to the upstream API."""

    async def send(self, request: httpx.Request, **kwargs: Any) -> httpx.Response:
        try:
            incoming = get_http_request()
            auth = incoming.headers.get("Authorization", "")
            if auth:
                request.headers["Authorization"] = auth
        except Exception:
            pass
        request.headers["Content-Type"] = "application/json"
        return await super().send(request, **kwargs)

api_client = HeaderForwardingClient(base_url="https://api.example.com")

mcp = FastMCP.from_openapi(
    openapi_spec=spec,
    client=api_client,
    name="My Authenticated API Server",
)
Header forwarding only works with HTTP transports such as streamable-http. It doesn’t apply to STDIO transport because there is no HTTP request context.

Proxy bridge

FastMCP's proxy support enables transport bridging, server aggregation, and gateway patterns. Create a proxy with FastMCP.as_proxy:
from fastmcp import FastMCP

# Bridge a remote HTTP server to local STDIO
http_proxy = FastMCP.as_proxy("http://example.com/mcp/sse", name="HTTP-to-stdio")

if __name__ == "__main__":
    http_proxy.run()  # Defaults to STDIO transport

Run the server

Start the server with transport, host, port, and path options:
if __name__ == "__main__":
    mcp.run(
        transport="streamable-http",  # "stdio" (default) | "streamable-http"
        host="0.0.0.0",
        port=8557,
        path="/mcp",
        stateless_http=True,
    )
Transport comparison:
| Feature | STDIO | Streamable HTTP |
| --- | --- | --- |
| Network access | No | Yes |
| Multiple clients | No | Yes |
| Production ready | Local only | Yes |
| Header forwarding | N/A | Yes |
Add a health check alongside the MCP endpoint:
from starlette.requests import Request
from starlette.responses import JSONResponse

@mcp.custom_route("/health", methods=["GET"])
async def health(request: Request) -> JSONResponse:
    return JSONResponse({"status": "ok"})

Development

Use these commands while you change the server or add tools:

Run tests

Execute the test suite with pytest:
uv sync --extra dev
uv run pytest

Build Docker image

Build a local image tag for the service:
docker build -t starter-mcp-server:local .

CI/CD

The .github/workflows directory defines these standard workflows:
  • PR checks: Linting, testing, and validation
  • Build and publish: Docker image creation on merge
  • Release: Automated versioning and release notes
See CI/CD workflows for pipeline details.

Expose an API as an MCP server on Azure APIM

Azure APIM exposes your existing APIs as MCP servers without custom MCP server code. You select API operations in the Azure Portal, and APIM handles MCP protocol translation, tool discovery, and invocation. This approach is useful when you already have APIs provisioned in APIM and want to make them available to AI agents without building a separate FastMCP server.

Prerequisites

Before you can expose an API as an MCP server on APIM:
  1. Ensure network connectivity. The platform must have network connectivity to the target APIs. If it doesn’t, raise a ticket with the Service Desk to request access.
  2. Upload your API spec to APIM and register your API in Azure APIM using applications-live.

High-level steps

Configure MCP in APIM through the portal:
  1. Navigate to APIs → MCP Servers (Preview) in your APIM instance
  2. Click Create MCP Server → Expose an API as an MCP Server
  3. Select the API and operations to expose, starting with read-only endpoints
  4. Configure APIM policies for security, including subscription key validation, rate limiting, and audit logging
  5. Test with MCP Inspector or curl
Known limitation: POST operations with requestBody.content sections in the OpenAPI spec cause /tools/list to hang indefinitely. Remove or simplify the requestBody.content section in your OpenAPI spec before you expose the API via MCP. See the full guide for details.
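As a workaround, you can strip those sections from the spec before uploading it to APIM. A sketch (the helper function is ours, not part of any tooling):

```python
def strip_request_body_content(spec: dict) -> dict:
    """Remove requestBody.content from every operation in an OpenAPI spec.

    Works around the APIM preview issue where POST operations with a
    requestBody.content section cause /tools/list to hang.
    """
    http_methods = {"get", "put", "post", "delete", "options", "head", "patch", "trace"}
    for path_item in spec.get("paths", {}).values():
        for method, operation in path_item.items():
            if method in http_methods and isinstance(operation, dict):
                body = operation.get("requestBody")
                if isinstance(body, dict):
                    # Drop only the content schemas; keep flags like "required".
                    body.pop("content", None)
    return spec
```

Load the spec with yaml.safe_load, run it through this helper, and re-serialize it before uploading to APIM.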
For complete click-ops instructions that cover APIM policies, security configuration, rate limiting, audit logging, external MCP server proxying, and known issues, see the APIM MCP: click-ops guide on Confluence.

MCP client

Connect to an MCP server from different types of clients, including programmatic Python clients, AI-native tools like Cursor and Claude Desktop, and AI agent frameworks like Agno.

GitHub repository

View source code, releases, and issues
You need a running MCP server to connect to. See MCP server to create one.

Transport types

Before connecting, determine which transport your MCP server uses:
| Transport | Best for | How it works |
| --- | --- | --- |
| Streamable HTTP | Production remote servers (recommended) | Client connects to an HTTP endpoint |
| STDIO | Local development, CLI servers | Client spawns the server as a subprocess |
| SSE | Legacy remote servers | HTTP with server-sent events (deprecated) |
SSE transport is deprecated by MCP. Always use Streamable HTTP for deployments. Use STDIO only for local development or CLI-based MCP servers such as npx and uvx packages.

FastMCP client

FastMCP provides a built-in Client class for connecting to any MCP server programmatically. The client infers the transport from what you pass to it.
import asyncio
from fastmcp import Client

async def main():
    async with Client("https://example.com/mcp") as client:
        tools = await client.list_tools()
        for tool in tools:
            print(f"  {tool.name}: {tool.description}")

        result = await client.call_tool("get_payment_orders", {"status": "ACCEPTED"})
        print(result)

asyncio.run(main())

Cursor IDE

Cursor has built-in MCP support (v0.40+). Create .cursor/mcp.json in your project root:
{
  "mcpServers": {
    "my-mcp-server": {
      "type": "streamableHttp",
      "url": "https://api.example.com/mcp",
      "headers": {
        "Authorization": "Bearer your-token",
        "X-User-Context": "user-123"
      }
    }
  }
}
Restart Cursor after adding the configuration. MCP servers only load at startup.

Claude Desktop

Claude Desktop reads its MCP configuration from claude_desktop_config.json:
| OS | Path |
| --- | --- |
| macOS | ~/Library/Application Support/Claude/claude_desktop_config.json |
| Windows | %APPDATA%\Claude\claude_desktop_config.json |
| Linux | ~/.config/Claude/claude_desktop_config.json |
{
  "mcpServers": {
    "my-mcp-server": {
      "command": "python",
      "args": ["path/to/my_mcp_server.py"]
    }
  }
}
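If you want to locate that file programmatically, for example in a setup script, a small stdlib-only helper (ours, hypothetical) can map the OS table above to a path:

```python
import os
import platform
from pathlib import Path

def claude_desktop_config_path() -> Path:
    """Return the Claude Desktop config file location for the current OS."""
    system = platform.system()
    if system == "Darwin":  # macOS
        return Path.home() / "Library" / "Application Support" / "Claude" / "claude_desktop_config.json"
    if system == "Windows":
        return Path(os.environ["APPDATA"]) / "Claude" / "claude_desktop_config.json"
    # Linux and other Unix-likes
    return Path.home() / ".config" / "Claude" / "claude_desktop_config.json"
```

A script can then read or patch the mcpServers section of that file with the json module.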

Agno agent

Agno is a Python framework for building AI agents with built-in MCP support through its MCPTools class.
from agno.agent import Agent
from agno.models.openai import OpenAIChat
from agno.tools.mcp import MCPTools, StreamableHTTPClientParams

params = StreamableHTTPClientParams(
    url="http://api.example.com/mcp",
    headers={
        "Authorization": "Bearer your-token",
        "X-User-Context": "user-123",
    },
    timeout=30,
    terminate_on_close=True,
)

mcp_tools = MCPTools(
    transport="streamable-http",
    server_params=params,
    timeout_seconds=60,
)
await mcp_tools.connect()

agent = Agent(
    name="My MCP Agent",
    model=OpenAIChat(id="gpt-4o-mini"),
    tools=[mcp_tools],
    instructions=["Use the available tools to answer user questions."],
    markdown=True,
)

await agent.aprint_response("Show me my payment orders")
await mcp_tools.close()

Pass headers to MCP servers

How you pass authentication headers depends on your client type:
from fastmcp import Client
from fastmcp.client.transports import StreamableHttpTransport

transport = StreamableHttpTransport(
    url="https://api.example.com/mcp",
    headers={"Authorization": "Bearer your-token"},
)

async with Client(transport) as client:
    result = await client.call_tool("get_orders", {})

Next steps

From here you can extend agents, teams, knowledge, and pipelines.