Prompts · Apr 1, 2026 · 1 min read

Fabric — 100+ AI Prompt Patterns for Everything

Fabric organizes 100+ AI prompt patterns for real-world tasks. 40.3K+ GitHub stars. 20+ providers, CLI + REST API, custom patterns. MIT.

TL;DR
CLI tool with 100+ curated AI prompt patterns for summarizing, coding, writing, and analysis tasks across 20+ providers.
§01

What it is

Fabric is a CLI tool that organizes 100+ AI prompt patterns for real-world tasks like summarizing articles, explaining code, extracting wisdom from podcasts, and writing essays. Each pattern is a carefully crafted system prompt that turns a general LLM into a specialized tool for a specific task. Fabric supports 20+ LLM providers and works via both CLI and REST API.

The project targets anyone who uses AI regularly and wants consistent, high-quality outputs without reinventing prompts. Developers, researchers, writers, and analysts all benefit from the curated pattern library. The project has 40.3K+ GitHub stars.

§02

How it saves time or tokens

Fabric eliminates the prompt-engineering cycle for common tasks. Instead of spending ten minutes crafting a summarization prompt, you pipe text through fabric --pattern summarize and get a structured result immediately. Patterns are tuned for output quality, which means fewer retry attempts and fewer tokens wasted on poorly structured prompts. Custom patterns let you extend the library for domain-specific tasks.

§03

How to use

  1. Install Fabric:
brew install fabric-ai
# Or: curl -fsSL https://github.com/danielmiessler/fabric/releases/latest/install.sh | bash
  2. Configure your AI provider:
fabric --setup
  3. Use a pattern:
echo 'Long article text...' | fabric --pattern summarize
cat code.py | fabric --pattern explain_code
fabric --pattern write_essay 'AI in 2026'
§04

Example

Extract key insights from a YouTube transcript:

# Download transcript and extract wisdom
yt --transcript 'https://youtube.com/watch?v=...' | fabric --pattern extract_wisdom

# Output includes:
# - SUMMARY: One-sentence overview
# - IDEAS: Key concepts discussed
# - INSIGHTS: Non-obvious takeaways
# - QUOTES: Notable statements
# - RECOMMENDATIONS: Actionable advice

Create a custom pattern for your specific domain:

# Create a custom pattern directory
mkdir -p ~/.config/fabric/patterns/review_pr
# Add system.md with your prompt template
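A minimal sketch of that layout, assuming the directory fabric scans is ~/.config/fabric/patterns (the review_pr name and prompt text are illustrative, not a built-in pattern):

```shell
# Create the directory fabric scans for custom patterns
mkdir -p "$HOME/.config/fabric/patterns/review_pr"

# system.md holds the system prompt fabric prepends to your piped input
cat > "$HOME/.config/fabric/patterns/review_pr/system.md" <<'EOF'
# IDENTITY
You are a senior engineer reviewing a pull request diff.

# STEPS
- Summarize the change in one sentence.
- List potential bugs, ordered by severity.
- Suggest concrete improvements.

# OUTPUT
Use the sections SUMMARY, BUGS, and IMPROVEMENTS.
EOF
```

Once the file exists, a pipeline like git diff | fabric --pattern review_pr should pick the pattern up alongside the built-in ones.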
§05

Related on TokRepo

§06

Common pitfalls

  • Fabric patterns are system prompts, not magic. Output quality still depends on the underlying LLM capability. Complex patterns may underperform on smaller models.
  • The --setup wizard stores API keys locally. Ensure your key file has restricted permissions on shared machines.
  • Custom patterns require following the exact directory structure. Missing the system.md file in your pattern directory causes silent fallback to default behavior.
  • Always check the official documentation for the latest version-specific changes and migration guides before upgrading in production environments.

Frequently Asked Questions

How many prompt patterns does Fabric include?

Fabric includes 100+ curated patterns covering summarization, code explanation, writing, analysis, extraction, and many other tasks. The library is actively maintained and new patterns are added regularly by the community.

Which AI providers does Fabric support?

Fabric supports 20+ providers including OpenAI, Anthropic, Google, Ollama for local models, and others. The setup wizard lets you configure your preferred provider with API key authentication.

Can I create custom patterns?

Yes. Custom patterns are directories containing a system.md file with your prompt template. Place them in ~/.config/fabric/patterns/ and they become available alongside built-in patterns via the CLI.

Does Fabric work with local LLMs?

Yes. Fabric supports Ollama for local model inference. This means you can use all patterns with locally running models for privacy-sensitive tasks or offline use.
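A hedged sketch of routing a pattern through a local model, assuming Ollama is running, the model (here llama3) has been pulled, and fabric's --model flag selects it; summarize_local is an illustrative wrapper, not part of fabric:

```shell
# Run a built-in pattern against a local Ollama model instead of a hosted API.
# The model name defaults to llama3 but can be passed as the first argument.
summarize_local() {
  fabric --pattern summarize --model "${1:-llama3}"
}
```

Usage would look like cat article.txt | summarize_local, or summarize_local mistral to pick a different pulled model.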

What is the difference between Fabric and a prompt library?

Fabric is an executable tool, not just a collection of prompts. It handles piping input, selecting providers, formatting output, and chaining patterns. A prompt library gives you templates to copy; Fabric gives you a command-line workflow.
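Because each pattern reads stdin and writes stdout, patterns compose like any Unix filter. A small sketch of chaining two patterns, assuming fabric and its yt helper are installed and configured; wisdom_summary is just an illustrative wrapper name:

```shell
# Chain two patterns: pull structured insights from a transcript,
# then condense those insights into a short summary.
wisdom_summary() {
  yt --transcript "$1" \
    | fabric --pattern extract_wisdom \
    | fabric --pattern summarize
}
```

Called as wisdom_summary 'https://youtube.com/watch?v=...', each stage streams its output straight into the next pattern.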


Source & Thanks

danielmiessler/fabric — 40,300+ GitHub stars
