
# 🏖️ Trip Planner: Streamlit with CrewAI

## Introduction

Trip Planner leverages the CrewAI framework to automate and enhance the trip-planning experience, integrating a CLI, a FastAPI service, and a user-friendly Streamlit interface.
## CrewAI Framework

CrewAI simplifies the orchestration of role-playing AI agents. In VacAIgent, these agents collaboratively decide on cities and craft a complete itinerary for your trip based on specified preferences, all accessible via a streamlined Streamlit user interface.
## Running the Application

To run the VacAIgent app:

1. **Configure the environment**: Set the environment variables for Browserless, Serper, and OpenAI. Use `secrets.example` as a guide to add your keys, then move that file (`secrets.toml`) to `.streamlit/secrets.toml`.
2. **Install dependencies**: Run `pip install -r requirements.txt` in your terminal.
3. **Launch the CLI mode**: Run `python cli_app.py -o "Bangalore, India" -d "Krabi, Thailand" -s 2024-05-01 -e 2024-05-10 -i "2 adults who love swimming, dancing, hiking, shopping, food, water sports adventures, rock climbing"`.
4. **Launch the FastAPI server**: Run `uvicorn api_app:app --reload`.
5. **Launch the Streamlit app**: Run `streamlit run streamlit_app.py` to start the Streamlit interface.
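A minimal `.streamlit/secrets.toml` might look like the following. The exact key names come from your `secrets.example` file; the names below are illustrative placeholders, not confirmed by this repository:

```toml
# .streamlit/secrets.toml — keep this file out of version control
BROWSERLESS_API_KEY = "your-browserless-key"
SERPER_API_KEY = "your-serper-key"
GEMINI_API_KEY = "your-gemini-key"
```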
★ **Disclaimer**: The application uses Gemini by default. Ensure you have access to Gemini's API and be aware of the associated costs.
## Details & Explanation

- **Streamlit UI**: The Streamlit interface is implemented in `streamlit_app.py`, where users can input their trip details.
- **Components**:
  - `./trip_tasks.py`: Contains task prompts for the agents.
  - `./trip_agents.py`: Manages the creation of agents.
  - `./tools` directory: Houses tool classes used by agents.
  - `./streamlit_app.py`: The heart of the Streamlit app.
## Using LLM Models

To switch between LLMs from different providers, pass a model to the `TripAgents` constructor:

```python
class TripAgents():
    def __init__(self, llm: BaseChatModel = None):
        if llm is None:
            # self.llm = LLM(model="groq/deepseek-r1-distill-llama-70b")
            self.llm = LLM(model="gemini/gemini-2.0-flash")
        else:
            self.llm = llm
```
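The fallback pattern above (use an injected LLM if one is given, otherwise default to Gemini) can be sketched in plain Python. `StubLLM` below is a stand-in for CrewAI's `LLM` class so the snippet runs without API keys:

```python
class StubLLM:
    """Stand-in for CrewAI's LLM class (illustration only)."""
    def __init__(self, model: str):
        self.model = model

class TripAgents:
    DEFAULT_MODEL = "gemini/gemini-2.0-flash"

    def __init__(self, llm=None):
        # Use the injected LLM when provided; otherwise fall back to Gemini.
        self.llm = llm if llm is not None else StubLLM(self.DEFAULT_MODEL)

# Default construction picks the Gemini model
agents = TripAgents()
print(agents.llm.model)  # gemini/gemini-2.0-flash

# Injecting a different provider overrides the default
custom = TripAgents(llm=StubLLM("groq/deepseek-r1-distill-llama-70b"))
print(custom.llm.model)  # groq/deepseek-r1-distill-llama-70b
```

This dependency-injection style keeps the agent code provider-agnostic: callers choose the model, and the class only supplies a sensible default.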
## Connect to LLMs

### Integrating Ollama with CrewAI

Pass the Ollama model to agents in the CrewAI framework:

```python
agent = Agent(
    role='Local AI Expert',
    goal='Process information using a local model',
    backstory="An AI assistant running on local hardware.",
    llm=LLM(model="ollama/llama3.2", base_url="http://localhost:11434")
)
```
## License

Trip Planner is open-sourced under the MIT License.
## Project Details

- Repository: Dips1902/Trip_Advisor
- License: MIT
- Last updated: 4/5/2025