Mar 31, 2026 · 2 min read

Chainlit — Build Conversational AI Apps Fast

Python framework for building production conversational AI interfaces in minutes. Chat UI, streaming, file upload, feedback, auth, and LangChain/LlamaIndex integration. 12K+ stars.

TL;DR
Chainlit provides a polished chat UI for Python AI apps with streaming, auth, and feedback built in.
§01

What it is

Chainlit is a Python framework for building production conversational AI interfaces. It provides a polished chat UI with streaming, file upload, feedback collection, authentication, and data persistence out of the box. It integrates natively with LangChain, LlamaIndex, Haystack, and any Python LLM library.

Chainlit targets Python developers who need a chat interface for their AI application without building frontend code. Apache 2.0 licensed.

§02

How it saves time or tokens

Chainlit eliminates the need to build a frontend for AI applications. Instead of writing React components for chat bubbles, streaming indicators, file uploads, and feedback buttons, you write a Python decorator and Chainlit generates the full UI. Multi-step displays show intermediate thinking steps, tool calls, and chain-of-thought reasoning. Built-in feedback collection (thumbs up/down) on AI responses provides data for fine-tuning without a separate feedback system.

§03

How to use

  1. Install Chainlit:
pip install chainlit
  2. Create a minimal chat app:
# app.py
import chainlit as cl

@cl.on_message
async def main(message: cl.Message):
    response = f'You said: {message.content}'
    await cl.Message(content=response).send()
  3. Run the app:
chainlit run app.py
# Open http://localhost:8000
§04

Example

A RAG chat app with LangChain and document upload:

import chainlit as cl
from langchain_anthropic import ChatAnthropic
from langchain.chains import RetrievalQA

llm = ChatAnthropic(model='claude-sonnet-4-20250514')

@cl.on_chat_start
async def start():
    files = await cl.AskFileMessage(
        content='Upload a PDF to chat with',
        accept=['application/pdf'],
    ).send()
    # Load files[0], split it, and embed it into a vector store
    # (e.g. FAISS or Chroma), then expose it as a retriever:
    retriever = build_retriever(files[0])  # your own indexing helper
    cl.user_session.set('retriever', retriever)

@cl.on_message
async def main(message: cl.Message):
    retriever = cl.user_session.get('retriever')
    chain = RetrievalQA.from_chain_type(
        llm=llm, retriever=retriever
    )
    result = await chain.ainvoke({'query': message.content})
    await cl.Message(content=result['result']).send()
§05


Common pitfalls

  • Not using async functions causes the UI to freeze during LLM calls. Always use async handlers with the @cl decorators.
  • Forgetting to set up authentication before deploying publicly exposes your AI app to everyone. Enable OAuth or custom auth before production deployment.
  • Not handling file upload errors gracefully. Users may upload unsupported formats or large files; add validation in your on_chat_start handler.

Frequently Asked Questions

What frameworks does Chainlit integrate with?

Chainlit integrates natively with LangChain, LlamaIndex, Haystack, and any Python LLM library. It provides callback handlers and decorators that work with these frameworks' async patterns.

Does Chainlit support streaming?

Yes. Chainlit supports token-level streaming. As the LLM generates tokens, they appear in the chat UI in real time. This works with OpenAI, Anthropic, and other providers that support streaming.

Can I add authentication to Chainlit apps?

Yes. Chainlit supports OAuth (Google, GitHub) and custom authentication out of the box. Configure auth in the chainlit config file and users must log in before accessing the chat.

How does feedback collection work?

Chainlit shows thumbs up/down buttons on each AI response. Users can click to rate the response. Feedback data is stored and can be exported for model evaluation or fine-tuning.

Can Chainlit display intermediate steps?

Yes. Chainlit's Step feature shows tool calls, intermediate reasoning, and chain-of-thought in collapsible sections within the chat. This helps users understand how the AI reached its answer.


Source & Thanks

Created by Chainlit. Licensed under Apache 2.0. Chainlit/chainlit — 12,000+ GitHub stars
