# Griptape — Modular Python AI Agent Framework

> Build AI agents with composable structures, tools, and memory. Off-Prompt data processing for secure enterprise use. 2.5K+ stars.

## Install

Save the content below to `.claude/skills/` or append to your `CLAUDE.md`:

# Griptape — Modular Python AI Agent Framework

## Quick Use

```bash
pip install "griptape[all]"  # quotes keep zsh from expanding the [all] extras
```

```python
from griptape.structures import Agent
from griptape.tools import Calculator, FileManager, WebScraper

# Create an agent with tools
agent = Agent(
    tools=[
        WebScraper(),
        Calculator(),
        FileManager(),
    ]
)

agent.run(
    "Scrape the Hacker News front page, calculate the average score "
    "of the top 5 posts, and save the result to results.txt"
)
```

Set your API key:

```bash
export OPENAI_API_KEY="sk-..."
# Or for Claude:
export ANTHROPIC_API_KEY="sk-ant-..."
```

---

## Intro

Griptape is a modular Python framework with 2,500+ GitHub stars for building AI agents, pipelines, and workflows. Its key innovation is **Off-Prompt** — a pattern where sensitive data is processed outside the LLM context window, enabling secure enterprise deployments. Griptape provides three composable structures (Agent, Pipeline, Workflow) that can be mixed with pluggable tools, memory modules, and any LLM backend. Backed by a company offering Griptape Cloud for managed deployment, it bridges the gap between experimental AI agents and production enterprise systems.

Works with: OpenAI, Anthropic Claude, Google Gemini, AWS Bedrock, Azure OpenAI, Cohere, HuggingFace, Ollama.

Best for enterprise teams building AI agents that handle sensitive data securely. Setup time: under 3 minutes.

---

## Griptape Structures & Patterns

### Three Core Structures

```python
from griptape.structures import Agent, Pipeline, Workflow
from griptape.tools import Calculator, WebScraper

# 1. Agent — single task with tools
agent = Agent(tools=[WebScraper(), Calculator()])
agent.run("What's a 15% tip on an $85 dinner?")

# 2. Pipeline — sequential multi-step tasks
# (research_task, analyze_task, report_task are tasks you define elsewhere)
pipeline = Pipeline()
pipeline.add_tasks(research_task, analyze_task, report_task)
pipeline.run("Analyze Q4 sales data")

# 3. Workflow — parallel + sequential DAG
workflow = Workflow()
workflow.add_tasks(task_a, task_b, task_c)  # a, b run in parallel
task_c.add_parents([task_a, task_b])        # c waits for both
workflow.run()
```

### Off-Prompt Processing

Griptape's signature feature — keep sensitive data out of the LLM:

```python
from griptape.drivers import LocalVectorStoreDriver
from griptape.engines import VectorQueryEngine
from griptape.structures import Agent
from griptape.tools import VectorStoreTool

# Data stays in a local vector store, never sent to the LLM
vector_store = LocalVectorStoreDriver()
query_engine = VectorQueryEngine(vector_store_driver=vector_store)

agent = Agent(
    tools=[
        # Tool processes data locally, sends only summaries to the LLM
        VectorStoreTool(query_engine=query_engine, off_prompt=True)
    ]
)

agent.run("What are the key risks in our financial report?")
# LLM sees: "Based on the retrieved context: [summary]"
# LLM never sees: raw financial data
```

### Tool System

Pre-built tools for common agent tasks:

| Tool | Capability |
|------|------------|
| **WebScraper** | Fetch and parse web pages |
| **Calculator** | Mathematical operations |
| **FileManager** | Read/write local files |
| **SqlClient** | Query SQL databases |
| **VectorStoreTool** | Semantic search over documents |
| **AwsS3Client** | Read/write S3 objects |
| **GoogleCalendar** | Manage calendar events |
| **EmailClient** | Send and read emails |
| **RestApiClient** | Call any REST API |

Build custom tools:

```python
from schema import Literal, Schema

from griptape.artifacts import TextArtifact
from griptape.tools import BaseTool
from griptape.utils.decorators import activity


class StockPriceTool(BaseTool):
    @activity(
        config={
            "description": "Get current stock price",
            "schema": Schema({Literal("symbol"): str}),
        }
    )
    def get_price(self, params: dict) -> TextArtifact:
        symbol = params["values"]["symbol"]
        # fetch_stock_price is your own lookup function (e.g. an HTTP call)
        price = fetch_stock_price(symbol)
        return TextArtifact(f"{symbol}: ${price}")
```

### Memory & Conversation

```python
from griptape.drivers import LocalConversationMemoryDriver
from griptape.memory.structure import ConversationMemory
from griptape.structures import Agent

# Persistent conversation memory
agent = Agent(
    conversation_memory=ConversationMemory(
        driver=LocalConversationMemoryDriver(file_path="memory.json")
    )
)

agent.run("My name is Alice and I work on ML pipelines")
# Later...
agent.run("What do you know about me?")
# -> "You're Alice, you work on ML pipelines"
```

### Multi-LLM Support

```python
from griptape.drivers import (
    AnthropicPromptDriver,
    OllamaPromptDriver,
    OpenAiChatPromptDriver,
)
from griptape.structures import Agent

# Use Claude for reasoning
agent = Agent(
    prompt_driver=AnthropicPromptDriver(model="claude-sonnet-4-6")
)

# Or OpenAI
agent = Agent(
    prompt_driver=OpenAiChatPromptDriver(model="gpt-4o")
)

# Or local via Ollama
agent = Agent(
    prompt_driver=OllamaPromptDriver(model="llama3")
)
```

---

## FAQ

**Q: What is Griptape?**
A: Griptape is a modular Python framework with 2,500+ GitHub stars for building AI agents with composable structures (Agent, Pipeline, Workflow), pluggable tools, and Off-Prompt data processing for secure enterprise use.

**Q: What makes Griptape different from LangChain or CrewAI?**
A: Griptape's key differentiator is Off-Prompt processing — sensitive data is processed locally and only summaries reach the LLM. It also has a cleaner modular architecture with three distinct structures, versus LangChain's sprawling API surface. CrewAI focuses on multi-agent roles; Griptape focuses on secure, composable pipelines.

**Q: Is Griptape free?**
A: Yes, the framework is open-source under Apache-2.0. Griptape Cloud (managed hosting) is a paid service with a free tier.

---

## Source & Thanks

> Created by [Griptape](https://github.com/griptape-ai). Licensed under Apache-2.0.
>
> [griptape](https://github.com/griptape-ai/griptape) — ⭐ 2,500+

Thanks to the Griptape team for bringing enterprise-grade security patterns to AI agent development.

---

Source: https://tokrepo.com/en/workflows/43de1956-a080-47df-bf44-087257ebf252
Author: TokRepo精选
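---

The Off-Prompt idea described above can be illustrated without Griptape itself. Below is a minimal, dependency-free sketch of the pattern: sensitive text is processed locally, and only a small retrieved context string is ever interpolated into the prompt. Every class and function name here is illustrative, not a Griptape API.

```python
# Conceptual sketch of Off-Prompt: raw (sensitive) data stays in a local
# store; only locally retrieved snippets are placed into the LLM prompt.
# All names here are illustrative, not Griptape APIs.


class LocalArtifactStore:
    """Holds raw text outside the prompt; hands out only opaque references."""

    def __init__(self):
        self._artifacts = {}

    def put(self, key, raw_text):
        self._artifacts[key] = raw_text
        return key  # only the reference leaves the store

    def query(self, key, keyword):
        # Stand-in for real local processing (vector search, extraction,
        # redaction). Only matching lines leave the store.
        lines = self._artifacts[key].splitlines()
        hits = [ln for ln in lines if keyword.lower() in ln.lower()]
        return " ".join(hits) or "(no matching context)"


def build_prompt(question, context):
    # Only the retrieved context, never the raw artifact, reaches the prompt.
    return f"Based on the retrieved context: {context}\n\nQuestion: {question}"


report = (
    "CONFIDENTIAL: employee salaries and account numbers...\n"
    "Key risk: customer churn increased 12% in Q4.\n"
    "CONFIDENTIAL: acquisition target shortlist...\n"
)

store = LocalArtifactStore()
ref = store.put("q4_report", report)

prompt = build_prompt("What are the key risks?", store.query(ref, "risk"))
print(prompt)
# The prompt contains the churn line, but none of the CONFIDENTIAL lines.
```

In Griptape, `off_prompt=True` plays the role of the store above: tool output is written to task memory, and the LLM works with references and summaries rather than the raw payload.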