# OpenDeepWiki — How It Works
## Processing Pipeline
- Clone — Clones the target repository
- Scan — Reads `.gitignore` and scans the directory structure
- Filter — Intelligently filters for relevant files
- Generate README — Creates or improves project README
- Classify — Generates project type classifications
- Catalog — Creates structured document catalog
- Process — Generates detailed documentation pages
- Visualize — Creates Mermaid diagrams for code structure
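The eight steps above can be sketched as a simple sequential pipeline. This is a minimal illustration: the stage names mirror the list, but the types, function bodies, and filter rule are hypothetical and do not reflect OpenDeepWiki's actual code.

```typescript
// Illustrative sketch of the documentation pipeline as sequential stages.
// All names and behaviors here are placeholders, not OpenDeepWiki's real API.

type Repo = { url: string; files: string[] };
type Step = (repo: Repo, log: string[]) => void;

const steps: [string, Step][] = [
  ["clone",     (r, log) => log.push(`cloned ${r.url}`)],
  ["scan",      (r, log) => log.push(`scanned ${r.files.length} files`)],
  // Hypothetical filter rule standing in for .gitignore-aware filtering:
  ["filter",    (r, log) => { r.files = r.files.filter(f => !f.startsWith("node_modules/")); log.push("filtered"); }],
  ["readme",    (_, log) => log.push("generated README")],
  ["classify",  (_, log) => log.push("classified project")],
  ["catalog",   (_, log) => log.push("built catalog")],
  ["process",   (_, log) => log.push("generated pages")],
  ["visualize", (_, log) => log.push("rendered Mermaid diagrams")],
];

function runPipeline(repo: Repo): string[] {
  const log: string[] = [];
  // Each stage runs in order and mutates the shared repo state.
  for (const [, step] of steps) step(repo, log);
  return log;
}
```

Running this against a toy repo produces one log entry per stage, in pipeline order.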
## Key Features
- Multi-Platform: GitHub, GitLab, Gitea, Gitee, AtomGit, Bitbucket
- Multi-Language: Analyzes code in any programming language
- Mermaid Diagrams: Auto-generated architecture and flow diagrams
- Conversational AI: Chat interface powered by RAG for code questions
- SEO-Friendly: Next.js frontend with proper meta tags
- MCP Support: Connect as MCP server to AI tools
- Multi-DB: SQLite, PostgreSQL, SQL Server, MySQL
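The conversational interface above relies on retrieval-augmented generation (RAG): the user's question is embedded, the closest documentation chunks are retrieved, and the LLM answers from those chunks. A minimal sketch of the retrieval step, where `embed` is a toy stand-in for a real embedding model and the store and prompt format are hypothetical:

```typescript
// Toy RAG retrieval: rank stored chunks by cosine similarity to the query.
// embed() is a fake 2-dimensional "embedding" for illustration only.

type Chunk = { text: string; vec: number[] };

function embed(text: string): number[] {
  // Fake embedding: [character count, vowel count].
  const vowels = (text.match(/[aeiou]/gi) ?? []).length;
  return [text.length, vowels];
}

function cosine(a: number[], b: number[]): number {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const na = Math.hypot(...a);
  const nb = Math.hypot(...b);
  return na && nb ? dot / (na * nb) : 0;
}

function retrieve(query: string, store: Chunk[], k: number): string[] {
  const q = embed(query);
  // Sort a copy so the store itself is left untouched.
  return [...store]
    .sort((x, y) => cosine(y.vec, q) - cosine(x.vec, q))
    .slice(0, k)
    .map(c => c.text);
}

function buildPrompt(query: string, context: string[]): string {
  // Retrieved chunks are prepended so the LLM answers from repo docs.
  return `Context:\n${context.join("\n")}\n\nQuestion: ${query}`;
}
```

In a real deployment the fake `embed` would be replaced by an embedding-model call and the linear scan by a vector index; the overall flow (embed, retrieve top-k, prompt) is the same.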
## Tech Stack
| Layer | Technology |
|---|---|
| Backend | C#, .NET 9, Semantic Kernel |
| Frontend | TypeScript, Next.js |
| AI Providers | OpenAI, Azure OpenAI, Anthropic |
| Databases | SQLite, PostgreSQL, SQL Server, MySQL |
| Deployment | Docker, Kubernetes (Sealos) |
## FAQ
Q: What is OpenDeepWiki?
A: A self-hosted tool that converts any Git repository into an AI-powered wiki with auto-generated documentation, Mermaid diagrams, and a conversational AI interface.

Q: Is it free?
A: Yes, it is MIT-licensed. You provide your own LLM API keys (OpenAI, Azure OpenAI, or Anthropic).

Q: How long does it take to process a repo?
A: Small repos take 1-2 minutes; large repos take 5-15 minutes, depending on repository size and LLM speed.