Frequently Asked Questions about Bolt.diy
What is Bolt.diy?
Bolt.diy (formerly oTToDev and Bolt.new ANY LLM) is an open-source AI-powered full-stack web development tool that allows you to prompt, run, edit, and deploy web applications directly from your browser. It supports multiple LLMs and integrates with StackBlitz’s WebContainers for a complete development environment.
What makes Bolt.diy different from other AI coding assistants?
Bolt.diy stands out because it provides a full-stack in-browser development environment where AI models have complete control over the environment, including the file system, node server, package manager, and terminal. This allows for more comprehensive AI-driven development.
Which LLMs are supported by Bolt.diy?
Bolt.diy currently supports OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, and Groq models. It can be extended to use any other model supported by the Vercel AI SDK.
Do I need API keys to use Bolt.diy?
Yes, you need API keys for the LLMs you want to use, such as OpenAI, Anthropic, or Groq. Ollama does not require an API key as it runs locally.
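As a reference, a minimal .env.local sketch with placeholder values. The variable names below are assumptions based on the common provider-key naming pattern; check the repository's .env.example for the exact names your version expects.

```shell
# .env.local — hypothetical sketch; confirm variable names against .env.example
OPENAI_API_KEY=sk-...your-key-here...
ANTHROPIC_API_KEY=sk-ant-...your-key-here...
GROQ_API_KEY=gsk_...your-key-here...
# Ollama runs locally and needs no key; at most a base URL if non-default:
OLLAMA_API_BASE_URL=http://localhost:11434
```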
How do I install Bolt.diy?
- Install Git and Node.js.
- Clone the Bolt.diy repository from GitHub.
- Rename .env.example to .env.local and add your LLM API keys.
- Install dependencies using pnpm install.
- Start the application with pnpm run dev.
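The steps above can be sketched as a single terminal session. The repository URL is an assumption (use whichever fork you are installing from):

```shell
# Clone the repository (URL assumed — substitute your fork if different)
git clone https://github.com/stackblitz-labs/bolt.diy.git
cd bolt.diy

# Create your local env file and add your LLM API keys to it
cp .env.example .env.local

pnpm install    # install dependencies
pnpm run dev    # start the development server
```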
Can I run Bolt.diy with Docker?
Yes, Bolt.diy can be run with Docker. Use the provided NPM scripts (npm run dockerbuild for development or npm run dockerbuild:prod for production) or use Docker’s target feature.
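A sketch of the two Docker paths mentioned above. The NPM script names come from the answer; the build-target names passed to docker build are assumptions — check the project's Dockerfile for the stage names it actually defines.

```shell
# Option 1: the provided NPM scripts
npm run dockerbuild        # development image
npm run dockerbuild:prod   # production image

# Option 2: Docker's --target feature directly
# (stage names are assumptions — verify against the Dockerfile)
docker build . --target bolt-ai-development
docker build . --target bolt-ai-production
```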
How do I contribute to Bolt.diy?
Check the CONTRIBUTING.md file for detailed instructions on how to contribute. You can contribute by adding new LLM support, improving the main prompt, implementing new features, or enhancing the documentation.
Where can I find the roadmap for Bolt.diy?
See the project roadmap, which outlines future plans and features.
How can I integrate Bolt.diy with UBOS?
Integrating Bolt.diy with UBOS allows you to orchestrate complex AI workflows, connect AI agents to enterprise data sources securely, build custom AI agents, and deploy/manage them at scale. UBOS enhances Bolt.diy’s capabilities for comprehensive AI integration.
What are the prerequisites for running Bolt.diy?
You need Git and Node.js installed. If you are using Docker, you will need Docker installed as well.
What do I do if I get a ‘command not found: pnpm’ error?
If you get this error, you need to install pnpm globally using the command: sudo npm install -g pnpm
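The fix from the answer above, plus an alternative: recent Node.js versions (16.13+) ship Corepack, which can provide pnpm without a global npm install.

```shell
# Option 1: install pnpm globally (as described above)
sudo npm install -g pnpm

# Option 2: enable pnpm via Corepack, bundled with Node.js 16.13+
corepack enable
corepack prepare pnpm@latest --activate

pnpm --version   # verify pnpm is now on your PATH
```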
How do I update Bolt.diy to the latest version?
To update Bolt.diy, navigate to the Bolt.diy directory in your terminal and run: git pull. This pulls the latest version of the software.
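After pulling, it is usually also necessary to refresh dependencies, since an update may add or change packages. A sketch (the directory name assumes you cloned into bolt.diy):

```shell
cd bolt.diy     # path assumed — use wherever you cloned the repository
git pull        # fetch and merge the latest changes
pnpm install    # pick up any new or changed dependencies
pnpm run dev    # restart the development server
```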
Bolt.diy
Project Details
- kschmelter13/bolt.diy
- MIT License
- Last Updated: 12/12/2024
Recommended MCP Servers
A Pyodide server implementation for the Model Context Protocol (MCP).
An extended version of the MCP server for Todoist integration that enables natural-language task management through Claude.
A Python-based MCP gateway.
A project for planning bike routes using MCP.
An MCP server for operating GitLab kanban boards.
An MCP server that converts Excel (.xls/.xlsx) and Apple Numbers (.numbers) files to PDF.
An MCP server for Kubernetes management commands.
A simple, clear example implementation for understanding Anthropic MCP (on AWS Bedrock).
An MCP server that fetches GitHub pull request comments.