MCP Server on Cloudflare Workers with Bearer Auth

This repository showcases a proof-of-concept implementation of a Model Context Protocol (MCP) server running on Cloudflare Workers, secured with simple bearer token authentication. This approach provides a lightweight and scalable solution for integrating external data and tools with Large Language Models (LLMs), making it ideal for building AI Agents.

What is an MCP Server?

At its core, an MCP (Model Context Protocol) server acts as a crucial intermediary, enabling seamless communication between AI models and the outside world. It provides a standardized way for applications to supply context to LLMs, enriching their understanding and capabilities. Think of it as a universal translator and secure gateway that allows AI to access real-time data, utilize specialized tools, and interact with various services.

Use Cases

The MCP Server architecture opens up a wide range of possibilities for AI Agent development and integration. Here are some prominent use cases:

  • Real-Time Data Access: Equip AI models with the ability to access and process current information, such as stock prices, weather data, news feeds, or social media trends. This eliminates the limitations of relying solely on pre-trained data and allows AI Agents to make informed decisions based on the latest available insights.
  • Tool Integration: Enable AI Agents to interact with external tools and services, such as calculators, search engines, CRM systems, or marketing automation platforms. This significantly expands the capabilities of AI Agents, allowing them to perform complex tasks and automate workflows across different applications.
  • Personalized Experiences: Tailor AI interactions based on user-specific data and preferences. By integrating with user profiles, purchase history, or browsing behavior, AI Agents can deliver highly personalized recommendations, content, and support.
  • Enterprise Data Connectivity: Securely connect AI models to internal enterprise data sources, such as databases, data warehouses, and APIs. This allows organizations to leverage their existing data assets to build custom AI solutions that address specific business challenges.
  • Workflow Automation: Automate complex business processes by chaining together multiple tools and data sources. For example, an AI Agent could automatically extract data from a document, perform calculations, and update a database, all without human intervention.
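The workflow-automation use case above can be sketched as a chain of tool calls. This is a hypothetical illustration, not code from this repository: the tool names (`extractTotals`, `updateRecord`) and the document format are assumptions, but the result shape mirrors MCP's text-content convention.

```typescript
// Hypothetical sketch: chaining tool calls the way an MCP-driven agent might.
// Tool names and the input format are illustrative, not part of this repo.

type ToolResult = { content: { type: "text"; text: string }[] };

// Stand-in for a document-extraction tool: pull numeric values from text.
const extractTotals = (doc: string): number[] =>
  (doc.match(/\d+(\.\d+)?/g) ?? []).map(Number);

// Stand-in for a calculation tool.
const sum = (values: number[]): number =>
  values.reduce((acc, v) => acc + v, 0);

// Stand-in for a database-update tool, returning an MCP-style text result.
const updateRecord = (total: number): ToolResult => ({
  content: [{ type: "text", text: `stored total: ${total}` }],
});

// "Workflow automation": extract -> calculate -> update, no human in the loop.
const result = updateRecord(sum(extractTotals("Invoice items: 19, 5, 25")));
console.log(result.content[0].text); // "stored total: 49"
```

In a real deployment each step would be a separate MCP tool invoked by the agent, with the LLM deciding the chaining order.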

Key Features

This proof-of-concept implementation highlights the following key features:

  • Serverless Architecture: The MCP server is deployed on Cloudflare Workers, a serverless platform that provides scalability, reliability, and cost-efficiency. This eliminates the need for managing servers and infrastructure, allowing developers to focus on building and deploying AI solutions.
  • Bearer Token Authentication: The server implements basic bearer token authentication to secure access to MCP tools. Clients must provide an Authorization header with their requests, ensuring that only authorized users can interact with the server.
  • MCP Tool Integration: The server exposes MCP tools via a serverless architecture, allowing AI models to access and utilize these tools through the standardized MCP protocol. This simplifies the integration process and ensures interoperability between different AI systems.
  • Ease of Deployment: The project provides a simple and straightforward deployment process, allowing developers to quickly deploy their own MCP server to Cloudflare Workers with just a few commands.
  • Local Development Environment: The project includes a local development environment that allows developers to test and debug their MCP server before deploying it to Cloudflare Workers.

Local Development Setup

To get started with local development, follow these steps:

  1. Install Dependencies:

    ```bash
    npm install
    ```

  2. Run the Server Locally:

    ```bash
    npm run dev
    ```

    After starting the server, it will be available at http://localhost:8787.
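Once the server is running, requests to it need a bearer token (see the authentication section below). A minimal sketch of the request shape, using the standard Fetch API available in Node 18+ and browsers; `my-secret-token` is a placeholder, not a real credential:

```typescript
// Build the kind of request the local server expects.
// "my-secret-token" is a placeholder value for illustration only.
const req = new Request("http://localhost:8787/sse", {
  headers: { Authorization: "Bearer my-secret-token" },
});

// Header names are case-insensitive; the Worker reads this on every request.
console.log(req.headers.get("authorization")); // "Bearer my-secret-token"
```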

Authentication Details

This implementation uses a simple bearer token scheme: clients must send an `Authorization: Bearer <token>` header with every request. The server passes the token through to the MCP tools, so each tool can act on behalf of the authenticated user.
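The kind of check a Worker's fetch handler can run before dispatching to MCP tools looks roughly like this. `extractBearerToken` is a hypothetical helper for illustration, not code from this repository:

```typescript
// Hypothetical helper: pull the token out of an Authorization header value,
// returning null when the header is missing or uses a different scheme.
function extractBearerToken(authHeader: string | null): string | null {
  if (!authHeader) return null;
  const match = authHeader.match(/^Bearer\s+(.+)$/i);
  return match ? match[1] : null;
}

// In a Worker this would typically be called as:
//   extractBearerToken(request.headers.get("Authorization"))
console.log(extractBearerToken("Bearer abc123"));  // "abc123"
console.log(extractBearerToken("Basic dXNlcg==")); // null (wrong scheme)
```

Note that this only parses the token; validating it (against a secret, a token store, or an OAuth provider) is left to the deployment, as the production-hardening section below points out.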

Testing with MCP Inspector

You can use the MCP Inspector to test your MCP server:

  1. Install and Start the Inspector:

    ```bash
    npx @modelcontextprotocol/inspector
    ```

  2. Configure the Inspector:

    • Switch the Transport Type to SSE
    • Enter the URL of your MCP server (local: http://localhost:8787/sse or deployed: https://your-worker.workers.dev/sse)
    • Add a bearer token in the Authorization field
    • Click “Connect”
  3. Test the Functionality:

    • Click “List Tools” to see available tools
    • Try running the “getToken” tool, which will return your authorization header
    • Try the “add” tool with two numbers to test basic functionality
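For reference, an MCP tool handler like the "add" tool above returns its result as an array of content items. A minimal sketch of that shape; the handler body here is an assumption for illustration, not copied from `src/index.ts`:

```typescript
// MCP tool results are content arrays; text results use { type: "text" }.
type ToolResult = { content: { type: "text"; text: string }[] };

// Illustrative handler for an "add"-style tool.
function addTool(a: number, b: number): ToolResult {
  return { content: [{ type: "text", text: String(a + b) }] };
}

console.log(addTool(3, 5).content[0].text); // "8"
```

The Inspector renders exactly this content array when you run a tool, which makes it easy to verify a tool's wiring end to end.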

Deploying to Cloudflare

Deploy your MCP server to Cloudflare Workers:

```bash
npm run deploy
```

After deployment, your server will be available at https://your-worker.workers.dev, where your-worker is the Worker name configured in wrangler.jsonc.

Project Structure Breakdown

  • src/index.ts - Contains the main server implementation, including MCP tools definitions.
  • src/utils.ts - Provides helper utilities for rendering the web interface.
  • wrangler.jsonc - Holds the Cloudflare Workers configuration.
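For orientation, a minimal `wrangler.jsonc` follows Wrangler's documented schema; the exact values in this repository may differ:

```jsonc
// Sketch of a minimal wrangler.jsonc; values here are placeholders.
{
  "name": "my-mcp-server",            // becomes <name> in <name>.workers.dev
  "main": "src/index.ts",             // the Worker entry point
  "compatibility_date": "2024-01-01"  // pins Workers runtime behavior
}
```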

Important Considerations

This is a proof-of-concept implementation and is intended to demonstrate how MCP can function within a serverless environment. For production deployments, you should implement the following:

  • More robust authentication mechanisms (OAuth, token validation, etc.).
  • Rate limiting and other security measures to protect against abuse.
  • Proper error handling and comprehensive monitoring.

Troubleshooting Tips

If you encounter any issues, consider the following:

  1. Ensure your bearer token is correctly formatted in the Authorization header.
  2. Check the worker logs in the Cloudflare dashboard for errors.
  3. Try restarting your local development server.

For persistent authentication problems, you might need to clear any cached credentials:

```bash
rm -rf ~/.mcp-auth
```

UBOS: The Full-Stack AI Agent Development Platform

UBOS is a comprehensive AI Agent development platform designed to empower businesses across all departments. We streamline the process of orchestrating AI Agents, connecting them with your enterprise data, building custom AI Agents using your LLM models, and creating sophisticated Multi-Agent Systems. By leveraging the MCP server architecture, UBOS can enhance its AI Agent capabilities with real-time data access, tool integration, and personalized experiences.

With UBOS, you can:

  • Orchestrate AI Agents: Design and manage complex AI Agent workflows with ease.
  • Connect to Enterprise Data: Securely integrate AI Agents with your existing data sources.
  • Build Custom AI Agents: Create tailored AI solutions that meet your specific business needs.
  • Develop Multi-Agent Systems: Build collaborative AI systems that can solve complex problems.

By integrating technologies like the MCP Server, UBOS ensures that your AI Agents are not only intelligent but also context-aware, adaptable, and seamlessly integrated into your existing business ecosystem. The future of AI in business is about intelligent agents working in harmony with your data and tools, and UBOS is here to make that future a reality.
