# Griptape Structures & Patterns

## Three Core Structures
```python
from griptape.structures import Agent, Pipeline, Workflow
from griptape.tools import Calculator, WebScraper

# 1. Agent — single task with tools
agent = Agent(tools=[WebScraper(), Calculator()])
agent.run("What's a 15% tip on an $85 dinner?")

# 2. Pipeline — sequential multi-step tasks
# (research_task, analyze_task, report_task are tasks defined elsewhere,
# e.g. PromptTask instances)
pipeline = Pipeline()
pipeline.add_tasks(research_task, analyze_task, report_task)
pipeline.run("Analyze Q4 sales data")

# 3. Workflow — parallel + sequential DAG
workflow = Workflow()
workflow.add_tasks(task_a, task_b, task_c)  # a and b run in parallel
task_c.add_parents([task_a, task_b])        # c waits for both
workflow.run()
```

## Off-Prompt Processing
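Stripped of framework specifics, the idea is simple: a tool's raw output goes into local storage, and only an opaque reference plus a short summary ever enters the prompt. A minimal illustrative sketch (all names here are invented, not Griptape APIs):

```python
# Conceptual sketch of off-prompt processing: raw tool output is parked in
# local storage; only a short reference/summary string reaches the LLM prompt.

class OffPromptStore:
    """Holds raw tool output locally, keyed by opaque reference IDs."""

    def __init__(self):
        self._artifacts = {}

    def put(self, data: str) -> str:
        ref = f"artifact-{len(self._artifacts)}"
        self._artifacts[ref] = data
        return ref

    def get(self, ref: str) -> str:
        return self._artifacts[ref]


def run_tool_off_prompt(store: OffPromptStore, raw_output: str) -> str:
    """Store raw output locally; return only a summary line for the prompt."""
    ref = store.put(raw_output)
    return f"[stored as {ref}: {len(raw_output)} chars of retrieved context]"


store = OffPromptStore()
raw = "CONFIDENTIAL: Q4 revenue fell 12%; covenant breach risk on credit line."
prompt_line = run_tool_off_prompt(store, raw)

assert "CONFIDENTIAL" not in prompt_line  # sensitive text never hits the prompt
assert store.get("artifact-0") == raw     # but it stays available locally
```

Griptape's actual API for this pattern is shown next.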
Griptape's signature feature — keep sensitive data out of the LLM:
```python
from griptape.structures import Agent
from griptape.drivers import LocalVectorStoreDriver, OpenAiEmbeddingDriver
from griptape.engines import VectorQueryEngine
from griptape.tools import VectorStoreTool

# Data stays in a local vector store and is never sent to the LLM
# (LocalVectorStoreDriver needs an embedding driver to index content)
vector_store = LocalVectorStoreDriver(embedding_driver=OpenAiEmbeddingDriver())
query_engine = VectorQueryEngine(vector_store_driver=vector_store)

agent = Agent(
    tools=[
        # The tool processes data locally and sends only summaries to the LLM
        VectorStoreTool(query_engine=query_engine, off_prompt=True)
    ]
)
agent.run("What are the key risks in our financial report?")
# LLM sees:       "Based on the retrieved context: [summary]"
# LLM never sees: raw financial data
```

## Tool System
Pre-built tools for common agent tasks:
| Tool | Capability |
|---|---|
| WebScraper | Fetch and parse web pages |
| Calculator | Mathematical operations |
| FileManager | Read/write local files |
| SqlClient | Query SQL databases |
| VectorStoreTool | Semantic search over documents |
| AwsS3Client | Read/write S3 objects |
| GoogleCalendar | Manage calendar events |
| EmailClient | Send and read emails |
| RestApiClient | Call any REST API |
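Under the hood, each tool method is an "activity": a callable tagged with a description and an input schema that the framework validates before dispatch. A framework-free sketch of that pattern (the decorator and names below are illustrative, not Griptape's API):

```python
# Sketch of the tool/activity pattern: a decorator attaches metadata to a
# method, and a dispatcher validates params before calling it.

def activity(description: str, required: set[str]):
    def wrap(fn):
        fn.config = {"description": description, "required": required}
        return fn
    return wrap


class CalculatorTool:
    @activity(description="Add two numbers", required={"a", "b"})
    def add(self, params: dict) -> str:
        return str(params["a"] + params["b"])


def dispatch(tool, name: str, params: dict) -> str:
    fn = getattr(tool, name)
    missing = fn.config["required"] - params.keys()  # schema-style validation
    if missing:
        raise ValueError(f"missing params: {sorted(missing)}")
    return fn(params)


print(dispatch(CalculatorTool(), "add", {"a": 2, "b": 3}))  # 5
```

Griptape's real `@activity` decorator plays the same role, as the custom-tool example below shows.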
Build custom tools:
```python
from schema import Literal, Schema

from griptape.artifacts import TextArtifact
from griptape.tools import BaseTool
from griptape.utils.decorators import activity


class StockPriceTool(BaseTool):
    @activity(
        config={
            "description": "Get the current stock price",
            "schema": Schema({Literal("symbol"): str}),  # schema goes inside config
        }
    )
    def get_price(self, params: dict) -> TextArtifact:
        symbol = params["values"]["symbol"]
        price = fetch_stock_price(symbol)  # your own lookup function (not shown)
        return TextArtifact(f"{symbol}: ${price}")
```

## Memory & Conversation
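Conversation memory reduces to persisting run/response pairs somewhere that outlives the process. A stdlib-only sketch of the idea (not Griptape's actual driver):

```python
# Minimal sketch of persistent conversation memory: each run appends an
# input/output pair to a JSON file; a fresh process reloads history from disk.
import json
import os
import tempfile
from pathlib import Path


class JsonConversationMemory:
    def __init__(self, path: str):
        self.path = Path(path)

    def load(self) -> list[dict]:
        if self.path.exists():
            return json.loads(self.path.read_text())
        return []

    def append(self, user: str, assistant: str) -> None:
        runs = self.load()
        runs.append({"user": user, "assistant": assistant})
        self.path.write_text(json.dumps(runs, indent=2))


path = os.path.join(tempfile.mkdtemp(), "memory.json")

memory = JsonConversationMemory(path)
memory.append("My name is Alice", "Nice to meet you, Alice")

# A new instance (e.g. in a new process) sees the earlier turn
restored = JsonConversationMemory(path)
assert restored.load()[0]["user"] == "My name is Alice"
```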
```python
from griptape.drivers import LocalConversationMemoryDriver
from griptape.memory.structure import ConversationMemory
from griptape.structures import Agent

# Persistent conversation memory, stored on disk between runs
agent = Agent(
    conversation_memory=ConversationMemory(
        driver=LocalConversationMemoryDriver(file_path="memory.json")
    )
)
agent.run("My name is Alice and I work on ML pipelines")
# Later, even in a new process...
agent.run("What do you know about me?")
# -> "You're Alice, and you work on ML pipelines"
```

## Multi-LLM Support
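What makes backends swappable is that a structure depends only on a narrow prompt-driver interface. A sketch of that design (illustrative classes, not Griptape's):

```python
# Sketch of the pluggable prompt-driver idea: the agent calls a small driver
# interface, so backends can be swapped without touching agent logic.
from typing import Protocol


class PromptDriver(Protocol):
    model: str

    def run(self, prompt: str) -> str: ...


class EchoDriver:
    """Stand-in backend that just echoes, tagged with its model name."""

    def __init__(self, model: str):
        self.model = model

    def run(self, prompt: str) -> str:
        return f"[{self.model}] {prompt}"


class MiniAgent:
    def __init__(self, prompt_driver: PromptDriver):
        self.prompt_driver = prompt_driver

    def run(self, prompt: str) -> str:
        return self.prompt_driver.run(prompt)


# Swapping backends is a one-line change:
agent = MiniAgent(prompt_driver=EchoDriver(model="claude"))
print(agent.run("hello"))  # [claude] hello
```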
```python
from griptape.drivers import AnthropicPromptDriver, OpenAiChatPromptDriver
from griptape.structures import Agent

# Use Claude for reasoning
agent = Agent(
    prompt_driver=AnthropicPromptDriver(model="claude-sonnet-4-6")
)

# Or OpenAI
agent = Agent(
    prompt_driver=OpenAiChatPromptDriver(model="gpt-4o")
)

# Or a local model via Ollama
from griptape.drivers import OllamaPromptDriver

agent = Agent(
    prompt_driver=OllamaPromptDriver(model="llama3")
)
```

## FAQ
**Q: What is Griptape?** A: Griptape is a modular, open-source Python framework (2,500+ GitHub stars) for building AI agents from composable structures (Agent, Pipeline, Workflow), pluggable tools, and off-prompt data processing for secure enterprise use.

**Q: What makes Griptape different from LangChain or CrewAI?** A: Griptape's key differentiator is off-prompt processing — sensitive data is processed locally and only summaries reach the LLM. It also has a cleaner modular architecture with three distinct structures, versus LangChain's sprawling API surface. CrewAI focuses on multi-agent roles; Griptape focuses on secure, composable pipelines.

**Q: Is Griptape free?** A: Yes; the framework is open-source under the Apache-2.0 license. Griptape Cloud (managed hosting) is a paid service with a free tier.