Bolt.diy: Your Open Source AI-Powered Full-Stack Web Development Companion
Bolt.diy, previously known as oTToDev and Bolt.new ANY LLM, represents a significant leap forward in AI-assisted web development. This open-source platform empowers developers to harness the power of large language models (LLMs) to prompt, run, edit, and deploy full-stack web applications directly from their browser. Unlike other AI tools, Bolt.diy provides unparalleled flexibility by allowing users to select the specific LLM they wish to use for each prompt, fostering a truly customizable and optimized development experience. Integration with UBOS unlocks even more power by letting you seamlessly embed agents directly into your applications.
The Core Philosophy: Democratizing AI-Powered Development
Bolt.diy’s core philosophy revolves around democratizing access to AI-powered development tools. By offering an open-source solution, Bolt.diy eliminates the barriers to entry often associated with proprietary platforms, enabling developers of all skill levels to leverage the benefits of AI in their web development workflows. The platform fosters a collaborative environment, encouraging community contributions and ensuring continuous improvement.
Key Features that Redefine Web Development:
- LLM Agnostic Architecture: One of Bolt.diy’s standout features is its LLM-agnostic architecture. This allows developers to choose from a wide array of LLMs, including OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, and Groq. Furthermore, the platform is designed to be easily extensible, allowing developers to integrate virtually any LLM supported by the Vercel AI SDK.
- Full-Stack Development in the Browser: Bolt.diy leverages StackBlitz’s WebContainers to provide a complete in-browser development environment. This eliminates the need for local setup and allows developers to:
- Install and run npm tools and libraries (e.g., Vite, Next.js)
- Run Node.js servers
- Interact with third-party APIs
- Deploy to production directly from the chat interface
- Share their work via a URL
- Complete Environment Control for AI Agents: Unlike traditional development environments where AI assistance is limited to code generation, Bolt.diy grants AI models complete control over the entire development environment. This includes the filesystem, node server, package manager, terminal, and browser console, enabling AI agents to manage the entire application lifecycle, from creation to deployment.
- Seamless Integration with UBOS Platform: Connecting Bolt.diy to UBOS unlocks powerful AI Agent orchestration capabilities. You can seamlessly connect your Bolt.diy applications to UBOS, enhancing them with sophisticated features such as:
- Agent Orchestration: Manage and coordinate multiple AI Agents to handle complex tasks within your application.
- Enterprise Data Connectivity: Securely connect your agents to your enterprise data sources, providing them with the information they need to make informed decisions.
- Custom AI Agent Building: Build bespoke AI Agents tailored to your specific needs and seamlessly integrate them into your Bolt.diy projects.
- Multi-Agent Systems: Architect sophisticated AI systems where multiple agents collaborate to achieve a common goal.
- Community-Driven Development: Bolt.diy is a community-driven project, with contributions from developers around the world. This ensures that the platform is constantly evolving and improving, with new features and integrations being added regularly.
Use Cases: Unleashing the Potential of AI-Powered Web Development
Bolt.diy caters to a wide range of use cases, empowering developers to build innovative and intelligent web applications with unprecedented speed and efficiency:
- Rapid Prototyping: Quickly generate functional prototypes of web applications using AI-powered code generation and scaffolding. This allows developers to rapidly iterate on their ideas and test different concepts.
- Full-Stack Application Development: Build complete full-stack web applications, from the front-end user interface to the back-end server logic, using AI to automate repetitive tasks and accelerate development.
- AI-Powered Feature Enhancement: Integrate AI-powered features into existing web applications, such as chatbots, recommendation engines, and personalized content delivery systems.
- Code Refactoring and Optimization: Leverage AI to automatically refactor and optimize existing codebases, improving performance, maintainability, and security.
- Learning and Experimentation: Bolt.diy provides an ideal environment for learning and experimenting with AI-powered web development techniques. Developers can explore different LLMs and AI algorithms, and experiment with different approaches to solving common web development challenges.
Getting Started with Bolt.diy: A Quick Guide
Setting up Bolt.diy is straightforward. Here’s a step-by-step guide:
Install Git: Download and install Git from https://git-scm.com/downloads.
Install Node.js: Download and install Node.js from https://nodejs.org/en/download/. Ensure that the path to Node.js is added to your system path.
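Before moving on, it can help to confirm both tools are actually reachable from your shell. A quick sanity check (each command should print a version string; a "command not found" error means that tool is missing from your PATH):

```shell
# Verify the prerequisites are installed and on your PATH:
git --version
node --version
npm --version
```

If `node` or `npm` is not found after installation, re-open your terminal so the updated PATH takes effect.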
Clone the Repository: Open a terminal window and clone the Bolt.diy repository:

```bash
git clone https://github.com/stackblitz-labs/bolt.diy.git
```
Configure Environment Variables: Rename the `.env.example` file to `.env.local` and add your LLM API keys. You’ll need to obtain API keys from the LLM providers you wish to use (e.g., OpenAI, Anthropic, Groq). Ollama does not require an API key as it runs locally.

Install Dependencies: Navigate to the Bolt.diy directory in your terminal and install the dependencies:

```bash
pnpm install
```
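As a rough sketch of what `.env.local` can look like, here is a hypothetical fragment. The exact variable names are defined in the repository's `.env.example`; the names and placeholder values below are illustrative, so copy the real names from that file:

```shell
# Illustrative .env.local -- set only the keys for providers you plan to use.
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GROQ_API_KEY=gsk_...
# Ollama runs locally, so it takes a base URL instead of an API key:
OLLAMA_API_BASE_URL=http://127.0.0.1:11434
```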
If you encounter a “command not found: pnpm” error, install pnpm globally:

```bash
sudo npm install -g pnpm
```
Start the Application: Start the development server:

```bash
pnpm run dev
```
Access the Application: Open your web browser and navigate to the address displayed in the terminal (typically `http://localhost:3000`).
Run with Docker
An alternative way to run Bolt.diy is with Docker.
Prerequisites
- Git and Node.js installed as mentioned above.
- Docker installed. Find the correct installation for your system at: https://www.docker.com/
1a. Using Helper Scripts
- NPM scripts are provided for convenient building:

```bash
# Development build
npm run dockerbuild

# Production build
npm run dockerbuild:prod
```
1b. Direct Docker Build Commands (alternative to using NPM scripts)
- You can use Docker’s target feature to specify the build environment instead of using NPM scripts if you wish:

```bash
# Development build
docker build . --target bolt-ai-development

# Production build
docker build . --target bolt-ai-production
```
2. Docker Compose with Profiles to Run the Container
- Use Docker Compose profiles to manage different environments:

```bash
# Development environment
docker-compose --profile development up

# Production environment
docker-compose --profile production up
```
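To illustrate how profiles select which service starts, here is a hypothetical excerpt of a Compose file. The service names, ports, and structure below are illustrative assumptions; the repository's own `docker-compose.yaml` is the source of truth:

```yaml
# Illustrative sketch: each service is gated behind a Compose profile,
# so `--profile development` starts only the development service.
services:
  app-dev:
    build:
      context: .
      target: bolt-ai-development   # matches the Docker build target above
    profiles: ["development"]
  app-prod:
    build:
      context: .
      target: bolt-ai-production
    profiles: ["production"]
```

Running `docker-compose --profile development up` then builds and starts only `app-dev`, because services with a `profiles` key are skipped unless their profile is activated.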
Extending Bolt.diy: Contributing to the Open-Source Community
Bolt.diy is an open-source project, and contributions from the community are highly encouraged. If you’re interested in contributing, please refer to the CONTRIBUTING.md file for detailed instructions. Some areas where contributions are particularly welcome include:
- Adding support for new LLMs
- Improving the main Bolt.new prompt in `app/lib/.server/llm/prompts.ts`
- Implementing requested features, such as file syncing, Docker containerization, and GitHub project publishing
- Enhancing the user interface and user experience
- Improving the documentation and providing tutorials
Future Plans and Roadmap: Shaping the Future of AI-Powered Development
The Bolt.diy roadmap outlines ambitious plans for the future, including:
- Preventing Bolt from rewriting files unnecessarily
- Improving prompting for smaller LLMs
- Running agents in the backend instead of a single model call
- Deploying directly to Vercel/Netlify/other similar platforms
- Having the LLM plan the project in a Markdown file for better results and transparency
- VSCode Integration with git-like confirmations
- Uploading documents for knowledge - UI design templates, a code base to reference coding style, etc.
- Voice prompting
- Azure OpenAI API Integration
- Perplexity Integration
- Vertex AI Integration
Bolt.diy and UBOS: A Synergistic Partnership
While Bolt.diy excels at providing an AI-powered full-stack development environment, UBOS offers a comprehensive platform for building and managing AI Agents. By integrating Bolt.diy with UBOS, developers can unlock a new level of power and flexibility, enabling them to:
- Orchestrate complex AI workflows involving multiple agents
- Connect AI agents to enterprise data sources securely
- Build custom AI agents tailored to specific business needs
- Deploy and manage AI agents at scale
In conclusion, Bolt.diy represents a groundbreaking approach to web development, empowering developers to leverage the power of AI to build innovative and intelligent applications with unprecedented speed and efficiency. By embracing the open-source philosophy and fostering a vibrant community, Bolt.diy is poised to revolutionize the way web applications are developed and deployed.
Unlock the potential of Bolt.diy with seamless integration into the UBOS platform. Empower your AI Agents by connecting them with your enterprise data, building custom solutions, and orchestrating multi-agent systems. UBOS, the full-stack AI Agent development platform, amplifies the capabilities of Bolt.diy, ensuring every business department can benefit from advanced AI integration.
Bolt.diy
Project Details
- kschmelter13/bolt.diy
- MIT License
- Last Updated: 12/12/2024