Hosting Agent Apps with Streamlit + FastAPI: Rapid UI for Intelligent Workflows

January 3, 2025


Your agent might be smart, but it needs a user interface to be useful. This guide walks you through combining Streamlit for the frontend and FastAPI for the backend to host your LangChain- or CrewAI-powered agent. Build, test, and deploy intelligent workflows with a fast, reactive UI and a clean API layer.


🧠 Why Streamlit + FastAPI?

Building agentic applications requires:

  • Frontend: A way for users to input tasks, view responses, and interact in real time
  • Backend: A secure, scalable layer to handle agent logic, prompt execution, and tool integration

Streamlit gives you a fast, Pythonic UI layer perfect for internal tools and dashboards.
FastAPI provides an async-ready, production-grade backend for your agent orchestration.

When combined, they allow you to:

  • ✅ Take user input
  • ✅ Call agents with tools and memory
  • ✅ Return results in real time
  • ✅ Serve via REST endpoints (or host locally)
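Under the hood, the two pieces talk over a small JSON contract. Here is a minimal sketch of that contract; the field names `question` and `response` are the ones this guide uses for its endpoint, and the example strings are placeholders:

```python
import json

# Hypothetical request/response shapes for the /run-agent endpoint.
request_body = {"question": "What's new with LangGraph?"}
response_body = {"response": "LangGraph adds graph-based control flow for agents."}

# Streamlit serializes the request to JSON; FastAPI parses it and replies in kind.
wire = json.dumps(request_body)
decoded = json.loads(wire)
print(decoded["question"])
```

Keeping the contract this small means either side can be swapped out (a CLI instead of Streamlit, say) without touching the other.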

What You’ll Build

A Research Assistant App where:

  • The user enters a query (e.g., “What’s new with LangGraph?”)
  • The app runs an agent to research the topic using DuckDuckGo
  • The backend (FastAPI) handles all LLM calls
  • The frontend (Streamlit) provides the interactive UI

📦 Tools You’ll Use

| Tool | Role |
| --- | --- |
| Streamlit | UI layer for interaction |
| FastAPI | Backend agent logic + API |
| CrewAI or LangChain | Agent orchestration |
| OpenAI / Hugging Face | LLMs powering reasoning |

🛠 Step-by-Step Guide


✅ Step 1: Install Dependencies

```bash
pip install streamlit fastapi uvicorn crewai langchain langchain-openai langchain-community openai duckduckgo-search
```

✅ Step 2: Create Your Backend with FastAPI

agent_backend.py

```python
from fastapi import FastAPI
from pydantic import BaseModel
from crewai import Agent, Task, Crew
from langchain_openai import ChatOpenAI
from langchain_community.tools import DuckDuckGoSearchRun

app = FastAPI()

class QueryRequest(BaseModel):
    question: str

@app.post("/run-agent")
async def run_agent(data: QueryRequest):
    llm = ChatOpenAI(model="gpt-3.5-turbo")
    search = DuckDuckGoSearchRun()

    researcher = Agent(
        role="Researcher",
        goal=f"Find detailed info about: {data.question}",
        backstory="An expert web researcher who digs up current information.",
        tools=[search],
        llm=llm
    )

    writer = Agent(
        role="Writer",
        goal="Write a clear and helpful summary",
        backstory="A technical writer who distills research into plain language.",
        llm=llm
    )

    task1 = Task(
        description="Do the research",
        expected_output="A set of findings that answer the question",
        agent=researcher
    )
    task2 = Task(
        description="Write summary from findings",
        expected_output="A concise, readable summary",
        agent=writer
    )
    crew = Crew(agents=[researcher, writer], tasks=[task1, task2])

    output = crew.kickoff()
    return {"response": str(output)}
```
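One gotcha worth noting: depending on your CrewAI version, `crew.kickoff()` may return a plain string or a richer result object, and FastAPI needs something JSON-serializable. A small coercion helper (a sketch of ours, not part of CrewAI's API) keeps the response safe either way:

```python
def to_json_safe(output) -> str:
    # Newer CrewAI versions return a result object from kickoff();
    # str() yields the final text in either case.
    return output if isinstance(output, str) else str(output)

print(to_json_safe("already a string"))
print(to_json_safe(42))
```

Calling `str()` on the result before returning it, as the endpoint above does, is the same idea inlined.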

✅ Step 3: Run Your FastAPI Backend

```bash
uvicorn agent_backend:app --reload
```

✅ Step 4: Create the Frontend with Streamlit

app_frontend.py

```python
import streamlit as st
import requests

st.set_page_config(page_title="Agent Assistant", layout="centered")

st.title("🔍 AI Research Assistant")
st.write("Ask a question, and let the agent research and summarize for you.")

user_input = st.text_input("Enter your query:")

if st.button("Run Agent") and user_input.strip():
    with st.spinner("Thinking..."):
        response = requests.post(
            "http://localhost:8000/run-agent",
            json={"question": user_input},
            timeout=300,  # agent runs can take a while
        )
        response.raise_for_status()
        result = response.json()
        st.success("Done!")
        st.markdown("### Agent Response")
        st.write(result["response"])
```
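If the backend returns an unexpected payload, indexing `result["response"]` directly will raise. A small defensive helper (hypothetical, not part of Streamlit or requests) makes the parsing step forgiving:

```python
def extract_agent_answer(payload: dict, key: str = "response") -> str:
    """Return the agent's answer from the backend JSON, with a safe fallback."""
    answer = payload.get(key)
    if not isinstance(answer, str) or not answer.strip():
        return "No response received from the agent."
    return answer.strip()

print(extract_agent_answer({"response": "  LangGraph is a graph runtime.  "}))
print(extract_agent_answer({}))
```

You could call this in place of `result["response"]` so a malformed reply shows a friendly message instead of a stack trace.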

✅ Step 5: Launch the App

  1. Start the backend:

```bash
uvicorn agent_backend:app --reload
```

  2. Run the frontend:

```bash
streamlit run app_frontend.py
```

🎉 You now have a working agentic app with a frontend + backend!


📦 Optional Enhancements

| Feature | How-To |
| --- | --- |
| 🔄 Real-time streaming output | Use FastAPI + LangChain’s streaming support |
| 🧠 Memory | Add FAISS or Chroma for vector memory |
| 🔐 API keys/secrets | Use .env or Streamlit secrets manager |
| 📁 File uploads | Let users upload PDFs and retrieve insights |
| 🌐 Deployment | Use Streamlit Cloud, Hugging Face Spaces, or Render.com |
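For the secrets row above, here is a minimal sketch of reading your OpenAI key from the environment rather than hard-coding it (assuming you export `OPENAI_API_KEY` yourself, or populate it from a `.env` file with python-dotenv):

```python
import os

def load_openai_key() -> str:
    # Never hard-code keys in source; read them from the environment at startup.
    key = os.environ.get("OPENAI_API_KEY", "")
    if not key:
        raise RuntimeError("OPENAI_API_KEY is not set")
    return key
```

In Streamlit Cloud you would read `st.secrets["OPENAI_API_KEY"]` instead; the fail-fast pattern is the same.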
