Workflows · Apr 1, 2026 · 2 min read

Langflow — Visual AI Agent Builder with API

Langflow is a visual platform for building and deploying AI agents as APIs or MCP servers. 146K+ GitHub stars. Multi-agent orchestration, playground, observability. MIT.

TL;DR
Langflow lets you build AI agents and RAG pipelines by connecting visual blocks, then deploy them as APIs or MCP servers.
§01

What it is

Langflow is an open-source visual platform for building AI agents, RAG pipelines, and multi-agent systems. You design workflows by dragging and connecting blocks on a browser-based canvas; each block represents an LLM call, a tool, a vector store, or a data transformation. Once built, workflows can be exported as REST APIs or MCP servers.

Langflow targets developers and teams who want to prototype AI workflows visually before committing to code. The playground lets you test and iterate on agents in real time, and the observability panel shows token usage, latency, and execution traces.

§02

How it saves time or tokens

Langflow replaces the cycle of writing Python code, testing it in a notebook, and refactoring it for production. The visual canvas makes it easy to rearrange workflow steps, swap LLM providers, and add tools without rewriting code. Exporting a flow as an API turns the prototype into a callable service without a separate engineering effort.

The built-in playground eliminates the need for separate testing scripts. You can send messages to your agent directly from the builder and see how each block processes the input.

§03

How to use

  1. Install Langflow: uv pip install langflow -U.
  2. Start the server: uv run langflow run.
  3. Open http://127.0.0.1:7860 in your browser, create a new flow, and start connecting blocks.
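Before opening the browser, you can confirm the server is listening. A minimal sketch; the /health path is an assumption (recent Langflow builds expose a health route, but check the interactive API docs at /docs on your instance for the exact path):

```python
import urllib.request


def server_up(base_url: str = "http://127.0.0.1:7860", timeout: float = 2.0) -> bool:
    """Return True if something answers HTTP 200 at base_url.

    The /health path is an assumption; verify it against your
    Langflow version's API docs.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Covers connection refused, timeouts, and HTTP errors alike.
        return False


if __name__ == "__main__":
    print("Langflow reachable:", server_up())
```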
§04

Example

# Install Langflow
uv pip install langflow -U

# Start the server
uv run langflow run

# Open http://127.0.0.1:7860
# Create a flow:
# 1. Add an 'OpenAI' LLM block
# 2. Connect a 'Prompt' block with your system message
# 3. Add a 'Chat Input' and 'Chat Output' block
# 4. Connect them: Input -> Prompt -> LLM -> Output
# 5. Click the playground button to test
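Once the flow behaves in the playground, it can be called over HTTP. A minimal Python sketch, assuming the default local server and Langflow's run endpoint; the flow ID is a placeholder you copy from the flow's API pane, and the payload field names should be verified against your version's API docs:

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:7860"  # default local Langflow server
FLOW_ID = "your-flow-id"            # placeholder: copy from the flow's API pane


def build_run_payload(message: str) -> dict:
    # Payload shape assumed from current Langflow docs; check the
    # "API" pane of your flow before relying on these field names.
    return {"input_value": message, "input_type": "chat", "output_type": "chat"}


def run_flow(message: str) -> dict:
    """POST a chat message to the flow's run endpoint and return the JSON reply."""
    req = urllib.request.Request(
        f"{BASE_URL}/api/v1/run/{FLOW_ID}",
        data=json.dumps(build_run_payload(message)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    try:
        print(run_flow("Hello, agent!"))
    except OSError as exc:  # server not running, or flow ID wrong
        print(f"Request failed: {exc}")
```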
§05

Common pitfalls

  • Visual workflows can become hard to maintain at scale; consider breaking complex agents into sub-flows for readability.
  • The default installation includes all provider integrations, which increases install size; use langflow[minimal] if you only need specific providers.
  • API export generates a single endpoint; for production deployments with authentication and rate limiting, place a reverse proxy in front of the Langflow API.
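For the last point, a minimal reverse-proxy sketch in nginx; the hostname, certificate paths, shared-secret check, and rate limit are all illustrative placeholders, not a complete auth setup:

```nginx
# Assumes this zone is declared once in the http{} block:
#   limit_req_zone $binary_remote_addr zone=langflow_rl:10m rate=5r/s;
server {
    listen 443 ssl;
    server_name langflow.example.com;           # illustrative hostname
    ssl_certificate     /etc/ssl/langflow.pem;  # illustrative paths
    ssl_certificate_key /etc/ssl/langflow.key;

    location / {
        # Naive shared-secret header check; use real auth in production.
        if ($http_x_api_key != "change-me") { return 401; }
        limit_req zone=langflow_rl burst=10 nodelay;
        proxy_pass http://127.0.0.1:7860;
        proxy_set_header Host $host;
    }
}
```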

Frequently Asked Questions

Can I deploy Langflow workflows as APIs?

Yes. Every Langflow workflow can be exported as a REST API endpoint. You can also deploy workflows as MCP servers, making them accessible to AI agents that support the Model Context Protocol.

What LLM providers does Langflow support?

Langflow supports OpenAI, Anthropic, Google, Azure, Ollama, and many other LLM providers. Each provider is represented as a block that you can drag into your workflow canvas.

Does Langflow support multi-agent workflows?

Yes. Langflow supports multi-agent orchestration where multiple agents can collaborate, delegate tasks, and share context within a single workflow. You can connect agent blocks in sequence or parallel.

Is Langflow free?

Langflow is open-source under the MIT license and free to self-host. DataStax offers a managed cloud version with additional features, but the core platform is fully functional when self-hosted.

Can I use Langflow with local LLMs?

Yes. Langflow integrates with Ollama and other local LLM runners. You can build workflows that use entirely local models with no API calls to external providers.


Source & Thanks

Created by Langflow AI. Licensed under MIT. langflow-ai/langflow (146,000+ GitHub stars).
