# AutoGen
AutoGen is a framework for creating multi-agent AI applications that can act autonomously or work alongside humans.
## Installation
AutoGen requires Python 3.10 or later.
```shell
# Install AgentChat and OpenAI client from Extensions
pip install -U "autogen-agentchat" "autogen-ext[openai]"
```
The current stable version is v0.4. If you are upgrading from AutoGen v0.2, please refer to the Migration Guide for detailed instructions on how to update your code and configurations.
```shell
# Install AutoGen Studio for no-code GUI
pip install -U "autogenstudio"
```
## Quickstart

### Hello World
Create an assistant agent using OpenAI’s GPT-4o model.
```python
import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def main() -> None:
    agent = AssistantAgent("assistant", OpenAIChatCompletionClient(model="gpt-4o"))
    print(await agent.run(task="Say 'Hello World!'"))


asyncio.run(main())
```
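To see the call shape without an API key, here is a minimal stand-in that mirrors the awaitable `run(task=...)` surface of the example above. `EchoAgent` and `TaskResult` are hypothetical stubs for illustration, not part of AutoGen; a real agent would call the model client instead of echoing the task.

```python
import asyncio
from dataclasses import dataclass


# Hypothetical stand-in for an agent: same awaitable run(task=...) shape
# as the example above, but no model call and no API key required.
@dataclass
class TaskResult:
    messages: list[str]


class EchoAgent:
    def __init__(self, name: str) -> None:
        self.name = name

    async def run(self, task: str) -> TaskResult:
        # A real agent would call its model client here.
        return TaskResult(messages=[task, f"{self.name}: {task}"])


async def main() -> TaskResult:
    agent = EchoAgent("assistant")
    return await agent.run(task="Say 'Hello World!'")


result = asyncio.run(main())
print(result.messages[-1])
```

The key point is that agent runs are coroutines: you compose them with `await` and drive the whole program with a single `asyncio.run(...)` at the top level.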
### Team
Create a group chat team with an assistant agent, a web surfer agent, and a user proxy agent for web browsing tasks. You also need to install Playwright.
```python
# pip install -U autogen-agentchat autogen-ext[openai,web-surfer]
# playwright install
import asyncio

from autogen_agentchat.agents import AssistantAgent, UserProxyAgent
from autogen_agentchat.conditions import TextMentionTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_agentchat.ui import Console
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.agents.web_surfer import MultimodalWebSurfer


async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="gpt-4o")
    assistant = AssistantAgent("assistant", model_client)
    web_surfer = MultimodalWebSurfer("web_surfer", model_client)
    user_proxy = UserProxyAgent("user_proxy")
    termination = TextMentionTermination("exit")  # Type 'exit' to end the conversation.
    team = RoundRobinGroupChat([web_surfer, assistant, user_proxy], termination_condition=termination)
    await Console(team.run_stream(task="Find information about AutoGen and write a short summary."))


asyncio.run(main())
```
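The control flow in a round-robin team is simple: agents take turns in a fixed order, and the run stops once a termination condition fires. The sketch below reimplements just that turn-taking and text-mention logic in plain Python, with trivial stand-in "agents" (the coroutine functions are hypothetical, not AutoGen components), so the scheduling is visible without any model calls.

```python
import asyncio
from itertools import cycle


# Stand-in agents: each takes the conversation history and returns one message.
async def surfer(history: list[str]) -> str:
    return "web_surfer: AutoGen is a multi-agent framework."


async def assistant(history: list[str]) -> str:
    return "assistant: Here is a short summary."


async def user(history: list[str]) -> str:
    return "user_proxy: exit"  # Saying 'exit' triggers termination.


async def run_team(agents, termination_text: str, max_turns: int = 10) -> list[str]:
    """Round-robin turn order with a text-mention termination condition."""
    history: list[str] = []
    for agent in cycle(agents):
        if len(history) >= max_turns:
            break
        message = await agent(history)
        history.append(message)
        if termination_text in message:
            break
    return history


history = asyncio.run(run_team([surfer, assistant, user], "exit"))
for message in history:
    print(message)
```

A `max_turns` cap alongside the text condition is a common safety net, since a team whose agents never mention the termination text would otherwise loop forever.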
### AutoGen Studio
Use AutoGen Studio to prototype and run multi-agent workflows without writing code.
```shell
# Run AutoGen Studio on http://localhost:8080
autogenstudio ui --port 8080 --appdir ./my-app
```
## Why Use AutoGen?

The AutoGen ecosystem provides everything you need to create AI agents, especially multi-agent workflows – framework, developer tools, and applications.
The framework uses a layered and extensible design. Layers have clearly divided responsibilities and build on top of layers below. This design enables you to use the framework at different levels of abstraction, from high-level APIs to low-level components.
- Core API implements message passing, event-driven agents, and local and distributed runtimes for flexibility and power. It also provides cross-language support for .NET and Python.
- AgentChat API implements a simpler but opinionated API for rapid prototyping. It is built on top of the Core API, is closest to what users of v0.2 are familiar with, and supports familiar multi-agent patterns such as two-agent chat or group chats.
- Extensions API enables first- and third-party extensions that continuously expand framework capabilities. It supports specific implementations of LLM clients (e.g., OpenAI, Azure OpenAI) and capabilities such as code execution.
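The Core layer's central idea is that agents do not call each other directly; they exchange messages through a runtime, and each agent reacts to the messages delivered to it. The following is a minimal sketch of that pattern in plain asyncio. All names here (`Runtime`, the handler registry) are illustrative, not the real `autogen-core` API.

```python
import asyncio


class Runtime:
    """Toy event-driven runtime: a queue of (recipient, payload) messages."""

    def __init__(self) -> None:
        self.queue: asyncio.Queue = asyncio.Queue()
        self.handlers: dict = {}
        self.log: list[str] = []

    def register(self, name: str, handler) -> None:
        self.handlers[name] = handler

    async def send(self, recipient: str, payload: str) -> None:
        await self.queue.put((recipient, payload))

    async def run_until_idle(self) -> None:
        # Deliver messages until no agent has anything pending.
        while not self.queue.empty():
            recipient, payload = await self.queue.get()
            await self.handlers[recipient](self, payload)


# Handlers react to messages and may send new ones, driving further events.
async def greeter(rt: Runtime, payload: str) -> None:
    rt.log.append(f"greeter got: {payload}")
    await rt.send("responder", "hello")


async def responder(rt: Runtime, payload: str) -> None:
    rt.log.append(f"responder got: {payload}")


async def main() -> list[str]:
    rt = Runtime()
    rt.register("greeter", greeter)
    rt.register("responder", responder)
    await rt.send("greeter", "start")
    await rt.run_until_idle()
    return rt.log


log = asyncio.run(main())
print(log)
```

Because agents only share a message channel, the same design generalizes from a local queue to a distributed runtime and to agents written in other languages, which is what motivates the cross-language support mentioned above.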
The ecosystem also includes two essential developer tools:

- AutoGen Studio provides a no-code GUI for building multi-agent applications.
- AutoGen Bench provides a benchmarking suite for evaluating agent performance.
You can use the AutoGen framework and developer tools to create applications for your domain. For example, Magentic-One is a state-of-the-art multi-agent team built using the AgentChat API and Extensions API that can handle a variety of tasks requiring web browsing, code execution, and file handling.
With AutoGen you get to join and contribute to a thriving ecosystem. We host weekly office hours and talks with maintainers and the community. We also have a Discord server for real-time chat, GitHub Discussions for Q&A, and a blog for tutorials and updates.
## Where to go next?
- Installation*
- Quickstart*
- Tutorial*
- API Reference*
- Packages*

*Releasing soon
Interested in contributing? See CONTRIBUTING.md for guidelines on how to get started. We welcome contributions of all kinds, including bug fixes, new features, and documentation improvements. Join our community and help us make AutoGen better!
Have questions? Check out our Frequently Asked Questions (FAQ) for answers to common queries. If you don’t find what you’re looking for, feel free to ask in our GitHub Discussions or join our Discord server for real-time support.
## Legal Notices
Microsoft and any contributors grant you a license to the Microsoft documentation and other content in this repository under the Creative Commons Attribution 4.0 International Public License, see the LICENSE file, and grant you a license to any code in the repository under the MIT License, see the LICENSE-CODE file.
Microsoft, Windows, Microsoft Azure, and/or other Microsoft products and services referenced in the documentation may be either trademarks or registered trademarks of Microsoft in the United States and/or other countries. The licenses for this project do not grant you rights to use any Microsoft names, logos, or trademarks. Microsoft’s general trademark guidelines can be found at http://go.microsoft.com/fwlink/?LinkID=254653.
Privacy information can be found at https://go.microsoft.com/fwlink/?LinkId=521839
Microsoft and any contributors reserve all other rights, whether under their respective copyrights, patents, or trademarks, whether by implication, estoppel, or otherwise.