Rivet — Visual AI Prompt Workflow IDE
Visual IDE for designing and debugging AI prompt chains. Drag-and-drop nodes for LLM calls, conditionals, loops, and data transforms with real-time execution preview.
What it is
Rivet is a visual development environment for designing AI prompt workflows. Instead of writing prompt chains in code, you build them by dragging and dropping nodes onto a canvas. Nodes represent LLM calls, conditional branches, loops, data transformers, and external API calls. Connections between nodes define the data flow.
Rivet targets prompt engineers, AI product builders, and teams prototyping complex LLM pipelines. The visual approach makes it easier to reason about multi-step prompt chains and debug failures at each node.
How it saves time or tokens
Debugging prompt chains in code requires adding print statements, re-running the entire chain, and manually inspecting intermediate outputs. Rivet shows the output of every node in real time as the chain executes. You can pause at any node, edit the prompt, and re-run from that point — avoiding wasted tokens on upstream nodes that already succeeded.
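The token savings from partial re-runs come down to caching: a node whose input has not changed does not need to be executed again. Here is a toy sketch of that idea in plain TypeScript (not Rivet's actual implementation), where each node's output is memoized by node id and input, so a second run skips nodes that already succeeded:

```typescript
// Toy sketch of partial re-runs: cache each node's output keyed by
// node id + input, so re-running the chain does not re-invoke
// upstream nodes whose inputs are unchanged.

type PipelineNode = { id: string; run: (input: string) => string };

function makeCachedRunner(nodes: PipelineNode[]) {
  const cache = new Map<string, string>();
  let calls = 0; // counts actual node executions

  return {
    run(input: string): string {
      let value = input;
      for (const node of nodes) {
        const key = `${node.id}:${value}`;
        if (!cache.has(key)) {
          cache.set(key, node.run(value)); // a real run would spend tokens here
          calls++;
        }
        value = cache.get(key)!;
      }
      return value;
    },
    get calls() {
      return calls;
    },
  };
}
```

Running the same input twice executes each node only once; only a changed input (or an edited node) triggers new executions downstream of the change.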
How to use
- Download and install Rivet from the official website or GitHub releases.
- Create a new project and drag LLM call nodes onto the canvas.
- Connect nodes with edges to define data flow, then press Run.
# Example node graph (conceptual):
Node 1: Text Input -> 'Summarize this article: {{article}}'
Node 2: LLM Call (GPT-4) -> receives Node 1 output
Node 3: Conditional -> if length > 200, route to Node 4; else Node 5
Node 4: LLM Call -> 'Shorten this summary to 2 sentences'
Node 5: Output -> final summary
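The conceptual graph above can be sketched as plain TypeScript to make the routing explicit. The `summarize` and `shorten` functions below are stubs standing in for the two LLM call nodes, so the conditional at Node 3 can be seen without making real API calls:

```typescript
// Sketch of the 5-node graph above. summarize/shorten are stubbed
// LLM calls; a real graph would hit an API provider here.

type LLMCall = (prompt: string) => string;

const summarize: LLMCall = (prompt) => `Summary: ${prompt.slice(0, 300)}`;
const shorten: LLMCall = (summary) =>
  summary.split('. ').slice(0, 2).join('. '); // keep first two sentences

function runPipeline(article: string): string {
  const prompt = `Summarize this article: ${article}`; // Node 1: Text Input
  const summary = summarize(prompt);                   // Node 2: LLM Call
  if (summary.length > 200) {                          // Node 3: Conditional
    return shorten(summary);                           // Node 4: shorten
  }
  return summary;                                      // Node 5: Output
}
```

In Rivet the same structure is built visually: the conditional node routes the summary along one of two edges instead of an `if` statement.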
Example
// Rivet also provides a Node.js SDK for running graphs programmatically
import { runGraph } from '@ironclad/rivet-node';
const result = await runGraph({
  graphFile: './my-workflow.rivet',
  inputs: {
    article: 'The latest advances in transformer architectures...'
  },
  context: {
    openAiKey: process.env.OPENAI_API_KEY
  }
});

console.log(result.output);
Related on TokRepo
- Automation tools — Explore more visual workflow builders for AI tasks
- Prompt library — Find reusable prompt templates to use in Rivet nodes
Common pitfalls
- Large graphs with many LLM call nodes can become expensive quickly; use mock nodes during development to avoid burning API credits on every test run.
- Rivet stores API keys in the project file by default; use environment variables instead to avoid accidental key exposure when sharing project files.
- Loop nodes without proper exit conditions can run indefinitely and consume tokens until the API rate limit or budget cap is hit.
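The loop pitfall has a standard fix: pair the semantic exit condition with a hard iteration cap. A minimal sketch in plain TypeScript, where the hypothetical `refine` callback stands in for the LLM call inside a Rivet loop node:

```typescript
// Loop guard sketch: a hard maxIterations cap ensures the loop
// terminates even when isDone() never becomes true, so a broken
// exit condition cannot consume tokens indefinitely.

interface LoopResult {
  output: string;
  iterations: number;
}

function refineUntilDone(
  input: string,
  isDone: (s: string) => boolean,
  refine: (s: string) => string, // would be an LLM call in a real graph
  maxIterations = 5,             // hard exit condition
): LoopResult {
  let current = input;
  let iterations = 0;
  while (!isDone(current) && iterations < maxIterations) {
    current = refine(current);
    iterations++;
  }
  return { output: current, iterations };
}
```

A budget cap (total tokens spent across iterations) works the same way and is worth adding alongside the iteration cap for expensive models.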
Frequently Asked Questions
Is Rivet free?
Rivet is open source and free to download and use. You pay only for the LLM API calls your workflows make; there is no per-seat or subscription fee for the IDE itself.

Can I run Rivet workflows without the visual editor?
Yes. Rivet provides a Node.js SDK (@ironclad/rivet-node) that loads and executes .rivet graph files programmatically, so you can embed Rivet workflows in your backend services without opening the visual editor.

Which LLM providers does Rivet support?
Rivet supports OpenAI, Anthropic Claude, and any OpenAI-compatible API endpoint. You configure the provider and API key per LLM call node or globally at the project level.

How does Rivet compare to LangChain and LangGraph?
LangChain and LangGraph are code-first frameworks, while Rivet is visual-first. Rivet is better for prototyping and debugging prompt chains interactively; LangChain and LangGraph are better for production code where you need full programmatic control and version-control diffs.

Can teams collaborate on a Rivet project?
Rivet project files are JSON-based and can be version-controlled in Git, so multiple team members can work on the same project using standard Git workflows. However, the visual editor does not support real-time collaborative editing the way Figma does.
Source & Thanks
Created by Ironclad. Licensed under MIT.
Ironclad/rivet — 3k+ stars