# OpenRouter MCP Client for Cursor

A Model Context Protocol (MCP) client for Cursor that uses OpenRouter.ai to access multiple AI models.
## Requirements

- Node.js v18.0.0 or later (important!)
- OpenRouter API key (get one at openrouter.ai/keys)
## Features

- Connect to OpenRouter.ai via MCP
- Access multiple AI models from various providers (Google, DeepSeek, Meta, etc.)
- Use MCP transport mechanism to communicate with Cursor
- Cache model information to reduce API calls
- Support for both free and paid models
- Multi-model completion utility to combine results from multiple models
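The multi-model completion utility listed above can be pictured roughly as follows. This is a hypothetical sketch, not the client's actual implementation: the function and parameter names are invented, and `complete` is injected so the sketch needs no network access.

```typescript
// Hypothetical sketch of multi-model completion: run the same prompt
// against several models and label each result. The real utility in
// src/index.ts may combine results differently.
type CompleteFn = (model: string, prompt: string) => Promise<string>;

async function multiModelCompletion(
  models: string[],
  prompt: string,
  complete: CompleteFn,
): Promise<string> {
  // Query all models concurrently and keep going even if one fails.
  const results = await Promise.allSettled(
    models.map(async (m) => `## ${m}\n${await complete(m, prompt)}`),
  );
  return results
    .map((r, i) => (r.status === "fulfilled" ? r.value : `## ${models[i]}\n(error)`))
    .join("\n\n");
}
```

Using `Promise.allSettled` rather than `Promise.all` means one slow or failing model does not discard the answers from the others.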
## Available Models

This client provides access to all models available on OpenRouter, including:
- Google Gemini 2.5 Pro
- DeepSeek Chat v3
- Meta Llama 3.1
- DeepSeek R1
- Qwen Coder
- Mistral Small 3.1
- And many more!
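The client caches model information (see Features) so this list doesn't have to be refetched from the API on every request. A rough sketch of such a TTL cache, with hypothetical names that may differ from the actual code:

```typescript
// Hypothetical sketch of the model-info cache: entries expire after a
// TTL so repeated lookups don't hit the OpenRouter API every time.
interface ModelInfo {
  id: string;
  name: string;
}

class ModelCache {
  private entries: ModelInfo[] | null = null;
  private fetchedAt = 0;

  constructor(private ttlMs: number = 10 * 60 * 1000) {}

  // Returns cached models if still fresh; otherwise calls `fetcher`
  // (e.g. a request to OpenRouter's model list) and caches the result.
  async getModels(fetcher: () => Promise<ModelInfo[]>): Promise<ModelInfo[]> {
    const fresh =
      this.entries !== null && Date.now() - this.fetchedAt < this.ttlMs;
    if (!fresh) {
      this.entries = await fetcher();
      this.fetchedAt = Date.now();
    }
    return this.entries!;
  }
}
```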
## Quick Installation

The easiest way to install is to use the setup script:

```bash
# Clone the repository
git clone https://your-repo-url/openrouter-mcp-client.git
cd openrouter-mcp-client

# Run the installation script
node install.cjs
```

The script will:

- Help you create a `.env` file with your OpenRouter API key
- Install all dependencies
- Build the project
- Provide next steps
## Manual Installation

If you prefer to install manually:

```bash
# Clone the repository
git clone https://your-repo-url/openrouter-mcp-client.git
cd openrouter-mcp-client

# Install dependencies
npm install

# Copy the environment file and edit it with your API key
cp .env.example .env

# Build the project
npm run build
```
## Configuration

Edit the `.env` file with your OpenRouter API key and default model:

```
OPENROUTER_API_KEY=your_api_key_here
OPENROUTER_DEFAULT_MODEL=google/gemini-2.5-pro-exp-03-25:free
```

Get your API key from openrouter.ai/keys.
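At startup the client needs to read these two variables; a minimal sketch of such a loader (hypothetical helper names — the actual code in `src/index.ts` may differ):

```typescript
// Hypothetical config loader: reads the two variables shown above and
// fails fast when the API key is missing.
interface ClientConfig {
  apiKey: string;
  defaultModel: string;
}

function loadConfig(
  env: Record<string, string | undefined> = process.env,
): ClientConfig {
  const apiKey = env.OPENROUTER_API_KEY;
  if (!apiKey) {
    throw new Error("OPENROUTER_API_KEY is not set; add it to your .env file");
  }
  return {
    apiKey,
    // Fall back to a free model when no default is configured.
    defaultModel:
      env.OPENROUTER_DEFAULT_MODEL ?? "google/gemini-2.5-pro-exp-03-25:free",
  };
}
```

Failing fast on a missing key gives a clearer error than a 401 response from OpenRouter later on.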
Cursor Integration
To use this client with Cursor, you need to update Cursor’s MCP configuration file:
Find Cursor’s configuration directory:
- Windows:
%USERPROFILE%.cursor
- macOS:
~/.cursor/
- Linux:
~/.cursor/
- Windows:
Edit or create the
mcp.json
file in that directory. Add a configuration like this:
```json
{
  "mcpServers": {
    "custom-openrouter-client": {
      "command": "node",
      "args": [
        "FULL_PATH_TO/openrouter-mcp-client/dist/index.js"
      ],
      "env": {
        "OPENROUTER_API_KEY": "your_api_key_here",
        "OPENROUTER_DEFAULT_MODEL": "google/gemini-2.5-pro-exp-03-25:free"
      }
    }
  }
}
```
Replace `FULL_PATH_TO` with the actual path to your client installation.

Restart Cursor, then select the client:

- Open Cursor
- Press Ctrl+Shift+L (Windows/Linux) or Cmd+Shift+L (macOS) to open the model selector
- Choose "custom-openrouter-client" from the list
## Direct Testing (without MCP)

Uncomment the test functions in `src/index.ts` to test direct API interaction:

```typescript
// Uncomment to test the direct API
testDirectApi().catch(console.error);
testMultiModelCompletion().catch(console.error);
```

Then run:

```bash
npm start
```
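For reference, a direct call to OpenRouter goes through its OpenAI-compatible chat completions endpoint. The sketch below is not the client's actual `testDirectApi` implementation; `buildRequest` is an invented helper that only assembles the `fetch` arguments so the payload can be inspected without sending anything.

```typescript
// Sketch of a direct OpenRouter request, independent of MCP.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildRequest(apiKey: string, model: string, messages: ChatMessage[]) {
  return {
    url: "https://openrouter.ai/api/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Sending it is then a single fetch (built into Node.js v18+):
// const { url, init } = buildRequest(key, model, msgs);
// const res = await fetch(url, init);
// const data = await res.json();
```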
## Development

```bash
# Watch mode for development
npm run dev
```
## Troubleshooting

### Node.js Version Requirements

Important: This project requires Node.js v18.0.0 or later. If you're using an older version, you will see EBADENGINE warnings and may encounter errors. To check your Node.js version:

```bash
node --version
```

If you have an older version, download and install the latest LTS release from nodejs.org.
### Module System Errors

If you encounter errors related to ES modules vs. CommonJS:

- The main codebase uses ES modules (indicated by `"type": "module"` in package.json)
- The installation script uses CommonJS (hence the .cjs extension)
- Make sure to run the installation script with `node install.cjs`
### Cursor Not Connecting

If Cursor doesn't seem to connect to your client:

- Make sure the path in `mcp.json` is correct and uses forward slashes
- Check that you've built the client with `npm run build`
- Verify that your OpenRouter API key is correct in the env settings
- Check Cursor's logs for any errors
## Related Resources

- OpenRouter Documentation
- Model Context Protocol (MCP) Documentation
- Cursor Editor
## Smithery Deployment

You can deploy this MCP client to Smithery to make it available to various AI agents and applications.
### Prerequisites

- GitHub account
- OpenRouter API key
### Steps to Deploy

1. Fork this repository to your GitHub account
2. Sign in to Smithery at smithery.ai using your GitHub account
3. Add a new server:
   - Click "Add Server" in the Smithery dashboard
   - Select your forked repository
   - Configure the build settings:
     - Set the base directory to the repository root
     - Ensure the Dockerfile and smithery.yaml are detected
4. Deploy your server:
   - Smithery will automatically build and deploy your MCP server
   - Once deployed, users can configure it with their own OpenRouter API key
### Using Your Deployed Server

After deployment, users can access your server through the Smithery registry:

1. In their MCP client (such as Claude or any other MCP-compatible client), add the server via Smithery's registry
2. Configure their OpenRouter API key and preferred default model
3. Start using the server to access multiple AI models through OpenRouter
### Smithery vs. Local Installation

- Smithery: Easier for distribution to others; users don't need to clone and build the repository
- Local Installation: Better for personal use and development; more control over the environment

Choose the approach that best fits your needs.
## Project Details

- Repository: palolxx/openroutermcptest4
- Last Updated: 5/28/2025