Scripts · Mar 29, 2026 · 1 min read

LlamaIndex — Data Framework for LLM Applications

Connect your data to large language models. The leading framework for RAG, document indexing, knowledge graphs, and structured data extraction.

TokRepo Featured · Community
Quick Use

Use it first, then decide how deep to go

Copy and run the command below to get started:

pip install llama-index

Intro

LlamaIndex (formerly GPT Index) is the go-to framework for connecting custom data to LLMs. Ingest data from 160+ sources, build indexes for fast retrieval, and create production-ready RAG pipelines.

Best for: Document Q&A, knowledge base search, data extraction, enterprise RAG
Works with: OpenAI, Anthropic, Google, Ollama, HuggingFace


Key Features

Data Connectors

Ingest from PDFs, databases, APIs, Notion, Slack, Google Drive, and 160+ sources:

from llama_index.core import SimpleDirectoryReader

# Load every supported file under ./data into Document objects
documents = SimpleDirectoryReader("./data").load_data()

Indexing & Retrieval

from llama_index.core import VectorStoreIndex

# Embed the documents and build an in-memory vector index
index = VectorStoreIndex.from_documents(documents)

# Ask natural-language questions against the indexed data
query_engine = index.as_query_engine()
response = query_engine.query("What is the revenue trend?")
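Rebuilding the index re-embeds every document, so for repeated runs it is worth persisting it to disk. A minimal sketch using LlamaIndex's storage APIs (the `./storage` path is illustrative, and an embedding/LLM provider such as OpenAI must be configured):

```python
from llama_index.core import (
    SimpleDirectoryReader,
    StorageContext,
    VectorStoreIndex,
    load_index_from_storage,
)

# Build once and persist the index to a local directory (path is illustrative).
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
index.storage_context.persist(persist_dir="./storage")

# On later runs, reload without re-reading or re-embedding the source files.
storage_context = StorageContext.from_defaults(persist_dir="./storage")
index = load_index_from_storage(storage_context)
response = index.as_query_engine().query("What is the revenue trend?")
```

In practice the build-and-persist step and the reload step live in separate code paths, guarded by a check for whether `./storage` already exists.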

Agents

Build data-aware agents that can query multiple data sources, use tools, and maintain conversation state.
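A minimal sketch of such an agent, assuming the `ReActAgent` and `QueryEngineTool` APIs from `llama_index.core` (module paths have shifted across LlamaIndex versions, and the tool name and description here are illustrative):

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import QueryEngineTool

# Build a query engine over local documents, then expose it as an agent tool.
documents = SimpleDirectoryReader("./data").load_data()
query_engine = VectorStoreIndex.from_documents(documents).as_query_engine()

docs_tool = QueryEngineTool.from_defaults(
    query_engine=query_engine,
    name="company_docs",  # illustrative tool name
    description="Answers questions about the documents in ./data",
)

# The ReAct agent decides when to call the tool and keeps conversation state
# across successive chat() calls.
agent = ReActAgent.from_tools([docs_tool], verbose=True)
response = agent.chat("Summarize the revenue trend across all documents.")
print(response)
```

Multiple tools can be passed to `from_tools`, letting one agent route questions across several data sources.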


🙏 Source & Thanks

Created by LlamaIndex. Licensed under MIT. run-llama/llama_index — 38K+ GitHub stars
