Morphic — Open-Source AI Answer Engine
Perplexity-style AI search with generative UI. Multiple LLM and search providers. Self-hostable with Next.js. 8.7K+ stars.
What it is
Morphic is an open-source AI answer engine in the style of Perplexity AI. It searches the web, synthesizes information from multiple sources, and presents answers with citations and a generative UI. Morphic supports multiple LLM providers (OpenAI, Anthropic, Google) and search backends (Tavily, SearXNG). Built with Next.js, it is fully self-hostable.
Morphic is designed for developers who want to build or self-host their own AI search experience without relying on proprietary platforms.
How it saves time or tokens
Building an AI search engine from scratch requires integrating web search APIs, implementing citation extraction, building a streaming UI, and handling multiple LLM providers. Morphic provides all of this as a ready-to-deploy Next.js application. Clone the repo, set your API keys, and you have a working AI search engine. The generative UI streams answers with inline citations, providing a polished user experience out of the box.
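To make concrete what Morphic abstracts away, here is a minimal sketch of the search-then-synthesize step: numbering search results so the model can cite them inline. All names here are illustrative, not Morphic's actual API.

```typescript
// Illustrative sketch of an answer-engine prompt builder, not Morphic's code.
// Search results are numbered so the LLM can cite them as [1], [2], ...

interface SearchResult {
  title: string;
  url: string;
  snippet: string;
}

// Collapse results into a numbered context block and wrap the question
// with an instruction to cite sources by index.
function buildPrompt(query: string, results: SearchResult[]): string {
  const context = results
    .map((r, i) => `[${i + 1}] ${r.title} (${r.url})\n${r.snippet}`)
    .join("\n\n");
  return (
    "Answer the question using only the sources below.\n" +
    "Cite sources inline as [n].\n\n" +
    `Sources:\n${context}\n\nQuestion: ${query}`
  );
}
```

The prompt produced this way is what gets streamed to the configured LLM; the `[n]` markers in the response are then mapped back to source cards in the UI.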
How to use
- Clone and start with Docker:

```shell
git clone https://github.com/miurla/morphic.git
cd morphic
docker compose up -d
```
- Or run locally for development:

```shell
git clone https://github.com/miurla/morphic.git
cd morphic
cp .env.example .env.local
# Edit .env.local with your API keys
npm install
npm run dev
```
- Open http://localhost:3000 and start searching.
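Before starting the server, it can help to verify that the required keys are present. The following preflight check is a hypothetical helper, not part of Morphic; it encodes the requirement that at least one LLM key and one search backend must be configured.

```typescript
// Illustrative preflight check (not part of Morphic): returns a list of
// configuration problems, empty when the minimal setup is satisfied.

function preflight(env: Record<string, string | undefined>): string[] {
  const problems: string[] = [];
  const llmKeys = [
    "OPENAI_API_KEY",
    "ANTHROPIC_API_KEY",
    "GOOGLE_GENERATIVE_AI_API_KEY",
  ];
  // Morphic needs at least one LLM provider...
  if (!llmKeys.some((k) => env[k])) {
    problems.push("no LLM provider key set");
  }
  // ...and one search backend (Tavily or SearXNG).
  if (!env.TAVILY_API_KEY && !env.SEARXNG_API_URL) {
    problems.push("no search backend configured (Tavily or SearXNG)");
  }
  return problems;
}

const issues = preflight({ OPENAI_API_KEY: "sk-..." });
// issues would contain the missing-search-backend warning here
```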
Example
Configuration for different LLM and search providers:
```
# .env.local
# LLM Provider (choose one)
OPENAI_API_KEY=sk-...
# or
ANTHROPIC_API_KEY=sk-ant-...
# or
GOOGLE_GENERATIVE_AI_API_KEY=...

# Search Provider
TAVILY_API_KEY=tvly-...
# or use SearXNG for fully self-hosted search
SEARXNG_API_URL=http://localhost:8888

# Optional
UPSTASH_REDIS_REST_URL=...  # For rate limiting
AUTH_SECRET=...             # For authentication
```
Morphic uses the configured provider to search the web, retrieve relevant pages, and generate a synthesized answer with source citations.
Related on TokRepo
- Research tools — Browse AI research and search tools
- Self-hosted tools — Explore self-hosted AI applications
Common pitfalls
- Not configuring a search provider. Morphic requires either Tavily or SearXNG for web search. Without a search backend, it can only generate answers from the LLM's training data, not live web results.
- Running without Redis in production. Redis (via Upstash) handles rate limiting and caching. Without it, your instance is vulnerable to abuse and makes redundant API calls.
- Expecting identical results to Perplexity. Morphic is an open-source alternative, not a clone. Answer quality depends on your chosen LLM and search provider combination.
- Starting with an overly complex configuration instead of defaults. Begin with the minimal setup, verify it works, then customize incrementally. This approach catches configuration errors early and keeps troubleshooting straightforward.
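On the rate-limiting pitfall above: the sketch below shows the fixed-window idea in its simplest in-memory form. It is illustrative only; in production Morphic delegates this to Upstash Redis so that limits survive restarts and are shared across instances.

```typescript
// Minimal in-memory fixed-window rate limiter (illustrative; a real
// deployment should use Redis so state is shared and persistent).

class FixedWindowLimiter {
  private counts = new Map<string, { windowStart: number; count: number }>();

  constructor(private limit: number, private windowMs: number) {}

  // Returns true if the request identified by `key` (e.g. an IP) is
  // allowed within the current window, false once the limit is hit.
  allow(key: string, now: number = Date.now()): boolean {
    const entry = this.counts.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      // New window: reset the counter.
      this.counts.set(key, { windowStart: now, count: 1 });
      return true;
    }
    entry.count += 1;
    return entry.count <= this.limit;
  }
}
```

The same logic maps onto Redis with an `INCR` plus an expiring key per window, which is roughly what rate-limiting libraries built on Upstash do.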
Frequently Asked Questions
How does Morphic compare to Perplexity?
Morphic provides similar functionality: web search, answer synthesis, and citations. The key difference is that Morphic is open source and self-hostable. You control the data, choose your LLM provider, and can customize the UI. Perplexity is a polished commercial product with proprietary search optimizations.
Can I use a local LLM with Morphic?
Yes, if your local LLM provides an OpenAI-compatible API (like Ollama or vLLM). Point the OpenAI API base URL to your local server. Quality depends on the local model's capabilities.
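For example, with Ollama's OpenAI-compatible endpoint the configuration might look like the snippet below. The variable names are assumptions about how the base URL is overridden; check Morphic's `.env.example` for the exact names in your version.

```
# .env.local — assumed variable names; verify against .env.example
OPENAI_API_KEY=ollama                      # placeholder; Ollama ignores the key
OPENAI_API_BASE=http://localhost:11434/v1  # Ollama's OpenAI-compatible endpoint
```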
Is Morphic free to use?
Morphic itself is free and open source. You pay for the LLM API calls (OpenAI, Anthropic) and search API calls (Tavily). Using SearXNG for search and a local LLM makes it completely free to operate.
Does Morphic stream answers?
Yes. Morphic streams answers token by token with a generative UI that renders citations and source cards as the answer is generated. This provides a responsive user experience.
Can I customize the UI?
Yes. Morphic is a standard Next.js application. You can modify components, styles, and layouts by editing the React components. The codebase uses Tailwind CSS for styling.
Citations (3)
- Morphic GitHub — Morphic is an open-source AI answer engine
- Morphic README — Built with Next.js and multiple LLM providers
- Tavily — Tavily search API for web search