Dify — Open-Source LLMOps Platform
Visual platform for building AI applications. Drag-and-drop workflow editor, RAG pipeline, agent builder, and API deployment. Self-hostable alternative to Zapier AI.
What it is
Dify is a visual platform for building production AI applications without deep coding. It provides a drag-and-drop workflow editor, RAG pipeline, agent builder, prompt IDE, and automatic API deployment. Every app gets a REST API endpoint automatically.
Dify works with OpenAI, Anthropic, Google, Azure, Ollama, HuggingFace, and 50+ model providers. It is self-hostable via Docker with full data control.
How it saves time or tokens
Dify lets non-engineers build AI applications that would otherwise require a full-stack developer. The visual workflow builder connects LLM calls, conditions, loops, and tools as nodes on a canvas. The RAG engine handles document upload, chunking, embedding, and retrieval with citation. Instead of writing custom orchestration code, you design workflows visually and deploy them as APIs. The prompt IDE lets you test and iterate prompts with variable injection before deploying.
How to use
- Deploy Dify with Docker:
git clone https://github.com/langgenius/dify.git
cd dify/docker
cp .env.example .env
docker compose up -d
- Open http://localhost/install to create your admin account.
- Create your first AI app:
- Choose a template (chatbot, text generation, workflow)
- Configure your LLM provider (API key)
- Design the workflow with visual nodes
- Publish and get your API endpoint
Example
Using the Dify API to interact with your published workflow:
import requests

response = requests.post(
    'http://localhost/v1/chat-messages',
    headers={
        'Authorization': 'Bearer app-your-api-key',
        'Content-Type': 'application/json'
    },
    json={
        'inputs': {},
        'query': 'What are the key features of our product?',
        'response_mode': 'streaming',
        'user': 'user-123'
    },
    stream=True
)

# Streaming responses arrive as server-sent events, one per line
for line in response.iter_lines():
    if line:
        print(line.decode())
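With response_mode set to streaming, each non-empty line is a server-sent event of the form data: {...}, where message events carry incremental answer text. A small helper to reassemble the full answer from those lines (the event names follow Dify's documented streaming format; verify against your Dify version):

import json

def collect_answer(lines):
    """Reassemble the answer text from Dify SSE lines.

    Each streaming line looks like:
        b'data: {"event": "message", "answer": "..."}'
    Chunks with event == "message" carry incremental answer text.
    """
    parts = []
    for raw in lines:
        text = raw.decode() if isinstance(raw, bytes) else raw
        if not text.startswith('data: '):
            continue
        payload = json.loads(text[len('data: '):])
        if payload.get('event') == 'message':
            parts.append(payload.get('answer', ''))
    return ''.join(parts)

# In practice, pass response.iter_lines(); shown here with captured lines
sample = [
    b'data: {"event": "message", "answer": "Key features: "}',
    b'data: {"event": "message", "answer": "visual workflows and RAG."}',
    b'data: {"event": "message_end", "metadata": {}}',
]
print(collect_answer(sample))  # Key features: visual workflows and RAG.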
Related on TokRepo
- AI tools for no-code — More visual AI building tools on TokRepo.
- RAG tools — Browse retrieval-augmented generation tools.
Common pitfalls
- Running Dify without configuring persistent storage means uploaded documents and conversation history are lost on restart. Configure volume mounts in Docker Compose.
- Not setting rate limits on published API endpoints exposes your LLM API keys to abuse. Configure API key authentication and rate limiting before sharing endpoints.
- The default Docker setup uses an embedded database. For production, configure an external PostgreSQL and Redis instance.
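If you put your own gateway service in front of a published endpoint, a token bucket is one common way to cap request rates before traffic reaches the LLM. A minimal sketch (illustrative only, not part of Dify; Dify's own API key authentication should be configured as well):

import time

class TokenBucket:
    """Minimal token-bucket rate limiter: sustain `rate` requests per
    second, allowing bursts up to `capacity`."""
    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill tokens for the elapsed interval, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=5, capacity=10)  # 5 req/s, bursts of 10
results = [bucket.allow() for _ in range(12)]
print(results.count(True))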
Frequently Asked Questions
Which model providers does Dify support?
Dify supports 50+ model providers including OpenAI, Anthropic (Claude), Google (Gemini), Azure OpenAI, Ollama, HuggingFace, Mistral, Cohere, and local models. Add providers through the settings UI with API keys.
Can I self-host Dify?
Yes. Dify is fully self-hostable via Docker Compose. Clone the repository, configure environment variables, and run docker compose up -d. All data stays on your infrastructure.
How does Dify's RAG pipeline work?
Upload documents (PDF, DOCX, TXT, Markdown); Dify chunks them automatically, generates embeddings, and stores them in a vector database. When users query, Dify retrieves relevant chunks and includes them as context for the LLM, with citations.
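The chunking step performed on upload can be sketched as fixed-size windows with overlap, so context is not lost at chunk boundaries (a simplified illustration, not Dify's actual implementation, which also respects document structure):

def chunk_text(text, size=500, overlap=50):
    """Split text into fixed-size chunks with overlap, a common
    pre-embedding step in RAG pipelines (simplified sketch)."""
    if size <= overlap:
        raise ValueError("size must exceed overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks

doc = ("Dify handles document ingestion for retrieval. " * 20).strip()
pieces = chunk_text(doc, size=200, overlap=20)
print(len(pieces), len(pieces[0]))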
Does every Dify app get an API?
Yes. When you publish a Dify app (chatbot, workflow, or text generation), it automatically gets a REST API endpoint with authentication. You can integrate the API into any application.
How does Dify compare to LangChain?
LangChain is a coding framework (Python/JS) for developers. Dify is a visual platform that lets non-developers build AI apps. Dify uses LLM orchestration concepts similar to LangChain but wraps them in a GUI with built-in hosting.
Source & Thanks
Created by LangGenius. Licensed under Apache 2.0. langgenius/dify — 55K+ GitHub stars