Chainlit — Build Conversational AI Apps Fast
Python framework for building production conversational AI interfaces in minutes. Chat UI, streaming, file upload, feedback, auth, and LangChain/LlamaIndex integration. 12K+ stars.
What it is
Chainlit is a Python framework for building production conversational AI interfaces. It provides a polished chat UI with streaming, file upload, feedback collection, authentication, and data persistence out of the box. It integrates natively with LangChain, LlamaIndex, Haystack, and any Python LLM library.
Chainlit targets Python developers who need a chat interface for their AI application without writing frontend code. It is Apache 2.0 licensed.
How it saves time or tokens
Chainlit eliminates the need to build a frontend for AI applications. Instead of writing React components for chat bubbles, streaming indicators, file uploads, and feedback buttons, you decorate a few Python functions and Chainlit generates the full UI. Multi-step displays show intermediate thinking steps, tool calls, and chain-of-thought reasoning. Built-in feedback collection (thumbs up/down) on AI responses provides data for fine-tuning without a separate feedback system.
How to use
- Install Chainlit:
pip install chainlit
- Create a minimal chat app:
# app.py
import chainlit as cl

@cl.on_message
async def main(message: cl.Message):
    response = f'You said: {message.content}'
    await cl.Message(content=response).send()
- Run the app:
chainlit run app.py
# Open http://localhost:8000
Example
A RAG chat app with LangChain and document upload:
import chainlit as cl
from langchain_anthropic import ChatAnthropic
from langchain.chains import RetrievalQA

llm = ChatAnthropic(model='claude-sonnet-4-20250514')

@cl.on_chat_start
async def start():
    files = await cl.AskFileMessage(
        content='Upload a PDF to chat with',
        accept=['application/pdf'],
    ).send()
    # Process the uploaded file and build a retriever
    # (vector-store construction elided here)
    cl.user_session.set('retriever', retriever)

@cl.on_message
async def main(message: cl.Message):
    retriever = cl.user_session.get('retriever')
    chain = RetrievalQA.from_chain_type(
        llm=llm, retriever=retriever
    )
    result = await chain.ainvoke({'query': message.content})
    await cl.Message(content=result['result']).send()
Related on TokRepo
- AI tools for coding — More AI application frameworks on TokRepo.
- RAG tools — Browse RAG tools that pair with Chainlit.
Common pitfalls
- Not using async functions causes the UI to freeze during LLM calls. Always use async handlers with the @cl decorators.
- Forgetting to set up authentication before deploying publicly exposes your AI app to everyone. Enable OAuth or custom auth before production deployment.
- Not handling file upload errors gracefully. Users may upload unsupported formats or large files; add validation in your on_chat_start handler.
Frequently Asked Questions
Which frameworks does Chainlit integrate with?
Chainlit integrates natively with LangChain, LlamaIndex, Haystack, and any Python LLM library. It provides callback handlers and decorators that work with these frameworks' async patterns.
Does Chainlit support streaming responses?
Yes. Chainlit supports token-level streaming. As the LLM generates tokens, they appear in the chat UI in real time. This works with OpenAI, Anthropic, and other providers that support streaming.
Does Chainlit support authentication?
Yes. Chainlit supports OAuth (Google, GitHub) and custom authentication out of the box. Configure auth in the chainlit config file and users must log in before accessing the chat.
How does feedback collection work?
Chainlit shows thumbs up/down buttons on each AI response. Users can click to rate the response. Feedback data is stored and can be exported for model evaluation or fine-tuning.
Can Chainlit show intermediate reasoning steps?
Yes. Chainlit's Step feature shows tool calls, intermediate reasoning, and chain-of-thought in collapsible sections within the chat. This helps users understand how the AI reached its answer.
Citations (3)
- Chainlit GitHub — Chainlit Python framework for conversational AI
- Chainlit Docs — Chainlit documentation and guides
- LangChain Docs — LangChain integration for chat applications
Source & Thanks
Created by Chainlit. Licensed under Apache 2.0. Chainlit/chainlit — 12,000+ GitHub stars