
🏖️ Trip Planner: Streamlit with CrewAI


Introduction

Trip Planner leverages the CrewAI framework to automate and enhance the trip planning experience, integrating a CLI, a FastAPI server, and a user-friendly Streamlit interface.

CrewAI Framework

CrewAI simplifies the orchestration of role-playing AI agents. In VacAIgent, these agents collaboratively decide on cities and craft a complete itinerary for your trip based on specified preferences, all accessible via a streamlined Streamlit user interface.

Running the Application

To experience the VacAIgent app:

  • Configure Environment: Set up the environment variables for Browserless, Serper, and OpenAI. Use secrets.example as a guide to add your keys, then save that file as .streamlit/secrets.toml.

  • Install Dependencies: Execute pip install -r requirements.txt in your terminal.

  • Launch the CLI Mode: Run python cli_app.py -o "Bangalore, India" -d "Krabi, Thailand" -s 2024-05-01 -e 2024-05-10 -i "2 adults who love swimming, dancing, hiking, shopping, food, water sports adventures, rock climbing" to start the CLI Mode.

  • Launch the FastAPI Server: Run uvicorn api_app:app --reload to start the FastAPI server.

  • Launch the Streamlit App: Run streamlit run streamlit_app.py to start the Streamlit interface.

Disclaimer: The application uses Gemini by default. Ensure you have access to the Gemini API and be aware of the associated costs.

Details & Explanation

  • Streamlit UI: The Streamlit interface is implemented in streamlit_app.py, where users can input their trip details.
  • Components:
    • ./trip_tasks.py: Contains task prompts for the agents.
    • ./trip_agents.py: Manages the creation of agents.
    • ./tools directory: Houses tool classes used by agents.
    • ./streamlit_app.py: The heart of the Streamlit app.

Using LLM Models

To switch to an LLM from a different provider, pass it to the TripAgents constructor:

from crewai import LLM
from langchain_core.language_models import BaseChatModel

class TripAgents():
    def __init__(self, llm: BaseChatModel = None):
        if llm is None:
            # Default model; uncomment the Groq line to use DeepSeek via Groq instead.
            # self.llm = LLM(model="groq/deepseek-r1-distill-llama-70b")
            self.llm = LLM(model="gemini/gemini-2.0-flash")
        else:
            self.llm = llm
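One lightweight way to switch providers without editing the file is to read the model id from an environment variable. The helper and variable name below are illustrative, not part of the project:

```python
import os

def resolve_model(default: str = "gemini/gemini-2.0-flash") -> str:
    """Pick a LiteLLM-style model id from an env var, falling back to the default."""
    return os.environ.get("TRIP_PLANNER_MODEL", default)

# TripAgents could then be constructed with LLM(model=resolve_model()),
# so switching providers is just: export TRIP_PLANNER_MODEL="groq/..."
```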

Connect to LLMs

Integrating Ollama with CrewAI

Pass the Ollama model to agents in the CrewAI framework:

    agent = Agent(
        role='Local AI Expert',
        goal='Process information using a local model',
        backstory="An AI assistant running on local hardware.",
        llm=LLM(model="ollama/llama3.2", base_url="http://localhost:11434")
    )
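Before wiring the agent up, it can help to confirm the model name matches one the local Ollama server actually serves. A minimal sketch (no CrewAI required) of the JSON payload Ollama's /api/generate endpoint accepts — the model name is an example and must match a model you have pulled locally (e.g. `ollama pull llama3.2`):

```python
import json

# Request body for POST http://localhost:11434/api/generate
payload = {
    "model": "llama3.2",
    "prompt": "Suggest one beach destination in Thailand.",
    "stream": False,  # return a single JSON response instead of a stream
}
body = json.dumps(payload).encode("utf-8")
# POST this body with urllib or requests to sanity-check the local server.
```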

License

Trip Planner is open-sourced under the MIT License.

