Prompts · Apr 6, 2026 · 2 min read

Fabric — AI Automation Patterns & Prompt Library

A curated collection of 100+ reusable AI prompt patterns for summarizing, extracting wisdom, writing, and coding. Run any pattern from the CLI with a single command. 30,000+ GitHub stars.

TL;DR
Fabric provides 100+ reusable AI prompt patterns you run from the CLI to summarize content, extract wisdom, write, and code.
§01

What it is

Fabric is a curated collection of reusable AI prompt patterns designed for common tasks like summarizing content, extracting key insights, writing, and coding. Each pattern is a well-crafted system prompt that you run from the command line with a single command. Fabric connects to your preferred LLM provider (OpenAI, Anthropic, Ollama, etc.) and pipes content through the selected pattern.

Developers, researchers, and knowledge workers use Fabric as their CLI prompt toolkit to apply consistent AI-powered transformations to text, video transcripts, and documents. The pattern library covers content analysis, writing, coding, security, and personal productivity.

§02

How it saves time or tokens

Crafting effective system prompts for each task is time-consuming and inconsistent. Fabric provides pre-tested patterns that produce reliable results. The CLI pipeline approach lets you chain patterns with Unix pipes, processing content through multiple transformations in a single command. Running patterns locally with Ollama eliminates API costs entirely.

§03

How to use

  1. Install Fabric:
go install github.com/danielmiessler/fabric@latest
  2. Set up your LLM provider:
fabric --setup  # configure API key for OpenAI, Anthropic, or Ollama
  3. Run patterns on content:
# Extract insights from a YouTube video
yt --transcript https://youtube.com/watch?v=xyz | fabric --pattern extract_wisdom

# Summarize an article
cat article.md | fabric --pattern summarize

# Analyze code for security issues
cat main.go | fabric --pattern analyze_code_security
§04

Example

# Chain multiple patterns with Unix pipes
curl -s https://example.com/blog-post | \
  fabric --pattern extract_article_content | \
  fabric --pattern summarize

# Use with local Ollama models (no API costs)
fabric --pattern write_essay --model ollama/llama3 < topic.txt

# List available patterns
fabric --listpatterns

# Create a custom pattern
mkdir -p ~/.config/fabric/patterns/my_pattern
cat > ~/.config/fabric/patterns/my_pattern/system.md << 'EOF'
# IDENTITY and PURPOSE
You are an expert at analyzing technical documentation.

# OUTPUT
Provide a structured summary with: key concepts, prerequisites, and gotchas.
EOF

cat docs.md | fabric --pattern my_pattern
§05

Common pitfalls

  • Fabric patterns are opinionated. Each pattern has a specific output format. Read the system.md file for a pattern before using it to understand what output structure to expect.
  • The yt command for YouTube transcripts requires yt-dlp to be installed separately. Without it, YouTube-related patterns fail at the transcript-extraction step.
  • Custom patterns must follow the directory structure ~/.config/fabric/patterns/<name>/system.md. If the system.md file is missing, the pattern does not appear in the pattern list (fabric --listpatterns).
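The last two pitfalls are easy to check mechanically before anything fails mid-pipeline. A minimal sketch in plain POSIX shell (this helper is ours, not part of Fabric; the default path matches Fabric's standard config location):

```shell
# Flag custom pattern directories that are missing system.md
PATTERNS_DIR="${PATTERNS_DIR:-$HOME/.config/fabric/patterns}"
for d in "$PATTERNS_DIR"/*/; do
  [ -f "${d}system.md" ] || echo "missing system.md in $d"
done

# Warn if yt-dlp (needed for YouTube transcript patterns) is absent
command -v yt-dlp >/dev/null 2>&1 || echo "yt-dlp not installed"
```

Run it after adding a new pattern; a silent exit means every pattern directory is well-formed.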

Frequently Asked Questions

What are Fabric patterns?

Fabric patterns are curated system prompts designed for specific tasks. Each pattern is a markdown file (system.md) that instructs the LLM on its role, input format, and output structure. Patterns cover summarizing, extracting wisdom, writing, coding, security analysis, and more.

What LLM providers does Fabric support?

Fabric supports OpenAI, Anthropic Claude, Google Gemini, Ollama (local models), and any OpenAI-compatible API. Configure your preferred provider during fabric --setup. You can switch providers per-command using the --model flag.

Can I create custom Fabric patterns?

Yes. Create a directory under ~/.config/fabric/patterns/ with a system.md file containing your system prompt. The pattern immediately becomes available via fabric --pattern your_pattern_name. Follow existing patterns as templates for structure.

How does Fabric work with YouTube videos?

Fabric includes a yt helper that extracts transcripts from YouTube videos using yt-dlp. Pipe the transcript into any pattern: yt --transcript URL | fabric --pattern extract_wisdom. This works for summarizing talks, extracting key points, or creating study notes.

Can I run Fabric with local models?

Yes. Configure Ollama as your provider during setup, then specify local models with --model ollama/llama3 or similar. This runs all processing locally with no API costs and no data leaving your machine.


Source & Thanks

Created by Daniel Miessler. Licensed under MIT.

fabric — ⭐ 30,000+

Thanks to Daniel Miessler for making AI prompt engineering accessible through reusable patterns.
