Fabric — AI Automation Patterns & Prompt Library
Curated collection of 100+ reusable AI prompt patterns for summarizing, extracting wisdom, writing, and coding. Run any pattern from the CLI with a single command. 30,000+ GitHub stars.
What it is
Fabric is a curated collection of reusable AI prompt patterns designed for common tasks like summarizing content, extracting key insights, writing, and coding. Each pattern is a well-crafted system prompt that you run from the command line with a single command. Fabric connects to your preferred LLM provider (OpenAI, Anthropic, Ollama, etc.) and pipes content through the selected pattern.
Developers, researchers, and knowledge workers who want to apply consistent AI-powered transformations to text, videos, and documents use Fabric as their CLI prompt toolkit. The pattern library covers content analysis, writing, coding, security, and personal productivity.
How it saves time or tokens
Crafting an effective system prompt for every task is time-consuming, and ad-hoc prompts produce inconsistent results. Fabric provides pre-tested patterns that behave predictably. The CLI pipeline approach lets you chain patterns with Unix pipes, running content through multiple transformations in a single command. Running patterns locally with Ollama eliminates API costs entirely.
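The pipe composition that makes chaining possible is ordinary Unix plumbing: every stage reads stdin, transforms, and writes stdout. A minimal sketch of that shape, with standard tools standing in for fabric patterns:

```shell
# tr and head stand in for fabric patterns here; each stage reads stdin,
# transforms, and writes stdout, so stages compose freely. With fabric
# installed the shape is identical:
#   cat article.md | fabric --pattern extract_article_content | fabric --pattern summarize
printf 'one\ntwo\nthree\n' | tr 'a-z' 'A-Z' | head -n 2
```

Any command that follows this stdin/stdout contract can sit anywhere in a fabric pipeline.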
How to use
- Install Fabric:
go install github.com/danielmiessler/fabric@latest
- Set up your LLM provider:
fabric --setup # configure API key for OpenAI, Anthropic, or Ollama
- Run patterns on content:
# Extract insights from a YouTube video
yt --transcript https://youtube.com/watch?v=xyz | fabric --pattern extract_wisdom
# Summarize an article
cat article.md | fabric --pattern summarize
# Analyze code for security issues
cat main.go | fabric --pattern analyze_code_security
Example
# Chain multiple patterns with Unix pipes
curl -s https://example.com/blog-post | \
fabric --pattern extract_article_content | \
fabric --pattern summarize
# Use with local Ollama models (no API costs)
fabric --pattern write_essay --model ollama/llama3 < topic.txt
# List available patterns
fabric --list
# Create a custom pattern
mkdir -p ~/.config/fabric/patterns/my_pattern
cat > ~/.config/fabric/patterns/my_pattern/system.md << 'EOF'
# IDENTITY and PURPOSE
You are an expert at analyzing technical documentation.
# OUTPUT
Provide a structured summary with: key concepts, prerequisites, and gotchas.
EOF
cat docs.md | fabric --pattern my_pattern
Related on TokRepo
- Prompt Library -- explore prompts and templates for AI tasks
- AI Tools for Automation -- discover automation tools and workflows
Common pitfalls
- Fabric patterns are opinionated. Each pattern has a specific output format. Read the system.md file for a pattern before using it to understand what output structure to expect.
- The yt command for YouTube transcripts requires yt-dlp to be installed separately. Without it, YouTube-related patterns fail at the transcript-extraction step.
- Custom patterns must follow the directory structure ~/.config/fabric/patterns/<name>/system.md. A missing system.md file prevents the pattern from appearing in --list.
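The required layout from the last pitfall can be sanity-checked with plain shell before expecting a pattern in --list. A minimal sketch, using a temporary directory as a stand-in for ~/.config/fabric (the check itself is illustrative, not part of fabric):

```shell
# Create a pattern skeleton and confirm system.md sits where fabric expects it.
base=$(mktemp -d)                      # stand-in for ~/.config/fabric
mkdir -p "$base/patterns/my_pattern"
cat > "$base/patterns/my_pattern/system.md" <<'EOF'
# IDENTITY and PURPOSE
You are an expert at analyzing technical documentation.
EOF
# A pattern only appears in --list if its directory contains system.md:
if [ -f "$base/patterns/my_pattern/system.md" ]; then
  echo "pattern ok"
fi
```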
Frequently Asked Questions
What are Fabric patterns?
Fabric patterns are curated system prompts designed for specific tasks. Each pattern is a markdown file (system.md) that instructs the LLM on its role, input format, and output structure. Patterns cover summarizing, extracting wisdom, writing, coding, security analysis, and more.
Which LLM providers does Fabric support?
Fabric supports OpenAI, Anthropic Claude, Google Gemini, Ollama (local models), and any OpenAI-compatible API. Configure your preferred provider during fabric --setup. You can switch providers per-command using the --model flag.
Can I create my own patterns?
Yes. Create a directory under ~/.config/fabric/patterns/ with a system.md file containing your system prompt. The pattern immediately becomes available via fabric --pattern your_pattern_name. Follow existing patterns as templates for structure.
How do I use Fabric with YouTube videos?
Fabric includes a yt helper that extracts transcripts from YouTube videos using yt-dlp. Pipe the transcript into any pattern: yt --transcript URL | fabric --pattern extract_wisdom. This works for summarizing talks, extracting key points, or creating study notes.
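Because the yt helper depends on yt-dlp being installed separately, it can help to check for it before piping. A hedged sketch; require_tool is an illustrative helper written for this page, not part of fabric:

```shell
# Print whether a required external tool is on PATH.
require_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1 found"
  else
    echo "$1 missing: install it before using YouTube patterns"
  fi
}
require_tool yt-dlp
```

The same check works for any external dependency a pattern pipeline relies on.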
Can I run Fabric fully locally?
Yes. Configure Ollama as your provider during setup, then specify local models with --model ollama/llama3 or similar. This runs all processing locally with no API costs and no data leaving your machine.
Citations (3)
- Fabric GitHub -- 100+ reusable AI prompt patterns for CLI automation
- Fabric README -- Supports multiple LLM providers including local Ollama
- Fabric Documentation -- Unix pipe-based pattern chaining for content processing
Source & Thanks
Created by Daniel Miessler. Licensed under MIT.
fabric — ⭐ 30,000+
Thanks to Daniel Miessler for making AI prompt engineering accessible through reusable patterns.