January 3, 2025
Rapid UI for Intelligent Workflows
Your agent might be smart, but it needs a user interface to be useful. This guide walks you through combining Streamlit for the frontend and FastAPI for the backend to host your LangChain- or CrewAI-powered agent. Build, test, and deploy intelligent workflows with a fast, reactive UI and a clean API layer.
Building agentic applications requires both a frontend for users and a backend for agent logic:

- Streamlit gives you a fast, Pythonic UI layer, perfect for internal tools and dashboards.
- FastAPI provides an async-ready, production-grade backend for your agent orchestration.

Combined, they let you expose an agent behind a clean API and interact with it through a reactive UI. In this guide we'll build a Research Assistant App where the user submits a question, a researcher agent gathers information, and a writer agent returns a summary.
| Tool | Role |
|---|---|
| Streamlit | UI layer for interaction |
| FastAPI | Backend agent logic + API |
| CrewAI or LangChain | Agent orchestration |
| OpenAI / Hugging Face | LLMs powering reasoning |
```bash
pip install streamlit fastapi uvicorn crewai langchain openai duckduckgo-search
```

agent_backend.py:
```python
from fastapi import FastAPI
from pydantic import BaseModel
from crewai import Agent, Task, Crew
from langchain.chat_models import ChatOpenAI
from langchain.tools import DuckDuckGoSearchRun

app = FastAPI()

class QueryRequest(BaseModel):
    question: str

@app.post("/run-agent")
async def run_agent(data: QueryRequest):
    llm = ChatOpenAI(model="gpt-3.5-turbo")
    search = DuckDuckGoSearchRun()

    # Researcher agent: uses DuckDuckGo search to gather information
    researcher = Agent(
        role="Researcher",
        goal=f"Find detailed info about: {data.question}",
        tools=[search],
        llm=llm,
    )

    # Writer agent: turns the research findings into a readable summary
    writer = Agent(
        role="Writer",
        goal="Write a clear and helpful summary",
        llm=llm,
    )

    task1 = Task(description="Do the research", agent=researcher)
    task2 = Task(description="Write summary from findings", agent=writer)

    crew = Crew(agents=[researcher, writer], tasks=[task1, task2])
    output = crew.kickoff()
    return {"response": str(output)}
```

Start the backend:

```bash
uvicorn agent_backend:app --reload
```

app_frontend.py:
```python
import streamlit as st
import requests

st.set_page_config(page_title="Agent Assistant", layout="centered")
st.title("AI Research Assistant")
st.write("Ask a question, and let the agent research and summarize for you.")

user_input = st.text_input("Enter your query:")

if st.button("Run Agent"):
    with st.spinner("Thinking..."):
        response = requests.post(
            "http://localhost:8000/run-agent",
            json={"question": user_input},
        )
        result = response.json()
    st.success("Done!")
    st.markdown("### Agent Response")
    st.write(result["response"])
```

Run the backend and the frontend in two separate terminals:

```bash
uvicorn agent_backend:app --reload
```

```bash
streamlit run app_frontend.py
```
You now have a working agentic app with a frontend and a backend!
| Feature | How-To |
|---|---|
| Real-time streaming output | Use FastAPI + LangChain's streaming support |
| Memory | Add FAISS or Chroma for vector memory |
| API keys/secrets | Use .env or the Streamlit secrets manager |
| File uploads | Let users upload PDFs and retrieve insights |
| Deployment | Use Streamlit Cloud, Hugging Face Spaces, or Render.com |
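As an example of the secrets row: a minimal, standard-library-only sketch of reading the OpenAI key from the environment and failing fast if it's missing. The helper name get_openai_key is hypothetical; python-dotenv or Streamlit's st.secrets follow the same pattern.

```python
import os

def get_openai_key() -> str:
    # Hypothetical helper: fail fast with a clear error instead of a
    # confusing downstream 401 from the OpenAI client.
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError("Set OPENAI_API_KEY before starting the backend")
    return key

os.environ["OPENAI_API_KEY"] = "sk-demo"  # for illustration only
print(get_openai_key())  # sk-demo
```

Call this once at startup in agent_backend.py so a missing key stops the server immediately rather than surfacing as a failed request later.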