# STORM — AI Research Report Generator by Stanford

> Stanford's LLM-powered system that researches any topic and writes a full Wikipedia-style article with citations. It simulates multi-perspective expert conversations.

## Quick Use

```bash
pip install knowledge-storm

# Set API keys
export OPENAI_API_KEY=sk-...
export YDC_API_KEY=...  # You.com search API
```

```python
from knowledge_storm import STORMWikiRunnerArguments, STORMWikiRunner, STORMWikiLMConfigs
from knowledge_storm.lm import OpenAIModel

# Configure LLMs
lm_configs = STORMWikiLMConfigs()
lm_configs.set_conv_simulator_lm(OpenAIModel(model="gpt-4o-mini"))
lm_configs.set_article_gen_lm(OpenAIModel(model="gpt-4o"))

# Configure and run
args = STORMWikiRunnerArguments(output_dir="./results")
runner = STORMWikiRunner(args, lm_configs)

# Generate a research report
runner.run(
    topic="Impact of large language models on scientific research",
    do_research=True,
    do_generate_outline=True,
    do_generate_article=True,
    do_polish_article=True,
)
runner.post_run()  # Saves article, outline, and references
```

## Intro

STORM (Synthesis of Topic Outlines through Retrieval and Multi-perspective question asking) is a research system by Stanford OVAL that writes Wikipedia-quality articles on any topic. It doesn't just summarize: it simulates conversations between multiple AI "experts" with different perspectives, then synthesizes their insights into a structured, cited article.

The process mirrors how human researchers work: discover perspectives, ask deep questions from each angle, gather sources, outline, write, and polish. Output includes a full article with inline citations, a reference list, and a conversation log showing the research process.

With 28,000+ stars on GitHub, STORM is the leading open-source tool for AI-powered research synthesis.

## Details

### How STORM Works

```
1.
   PERSPECTIVE DISCOVERY
   → Given topic "quantum computing applications"
   → Identifies 5-8 expert perspectives:
     - Physicist, Computer Scientist, Industry Analyst, Ethics Researcher, Quantum Engineer...

2. MULTI-PERSPECTIVE RESEARCH
   → Each "expert" asks questions from their angle
   → Questions are answered using web search (You.com, Bing)
   → Simulated conversations deepen understanding

3. OUTLINE GENERATION
   → Synthesizes all research into a structured outline
   → Identifies key sections, subsections, and themes

4. ARTICLE WRITING
   → Writes each section using gathered evidence
   → Adds inline citations [1], [2], [3]
   → Maintains a coherent narrative across sections

5. POLISH
   → Removes redundancy, improves flow
   → Verifies citation accuracy
   → Outputs the final article + references
```

### Output Structure

- `storm_gen_article.txt` — Full article (2,000-5,000 words)
- `storm_gen_outline.txt` — Structured outline
- `url_to_info.json` — All source URLs with extracted content
- `conversation_log.json` — Full simulated expert conversations

### Supported LLM Backends

- OpenAI (GPT-4o, GPT-4o-mini)
- Anthropic (Claude)
- Ollama (local models)
- Any LiteLLM-compatible provider

### Co-STORM: Collaborative Mode

Co-STORM lets you participate in the research process interactively: the AI researches while you steer the direction, ask follow-up questions, and refine the scope in real time.

## Frequently Asked Questions

**Q: How long does a full research report take?**
A: 3-10 minutes, depending on topic complexity and LLM speed. Most of the time is spent on web search and the multi-perspective conversations.

**Q: How accurate are the citations?**
A: Citations link to real web sources, and STORM extracts and verifies information from each source. However, always verify critical claims — it's AI-assisted research, not a replacement for peer review.

**Q: Can I use it for academic papers?**
A: It's excellent for literature reviews, background research, and first drafts.
The output needs human review and editing for publication-quality work.

**Q: Does it work offline?**
A: Partially. With Ollama for the LLM and cached sources, generation works offline, but the research phase needs internet access for web search.

## Works With

- OpenAI, Anthropic, Ollama, any LiteLLM provider
- You.com Search API, Bing Search API
- Python 3.11+
- Any topic — technical, historical, scientific, current events

## Source & Thanks

- **GitHub**: [stanford-oval/storm](https://github.com/stanford-oval/storm) — 28,000+ stars, MIT License
- **Paper**: [Assisting in Writing Wikipedia-like Articles From Scratch with Large Language Models](https://arxiv.org/abs/2402.14207)
- By Stanford OVAL (Open Virtual Assistant Lab)
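The five-stage pipeline described under "How STORM Works" can be sketched as a toy control flow. Everything in the sketch below (function names, personas, stub findings, `example.com` URLs) is invented for illustration and is not the `knowledge-storm` API; the real system drives each stage with LLM calls and live web search, and the polish stage is omitted here for brevity.

```python
def discover_perspectives(topic):
    # Stage 1: in STORM, an LLM proposes 5-8 expert personas for the topic.
    # Stubbed with a fixed list here.
    return ["Physicist", "Computer Scientist", "Industry Analyst"]

def research(topic, perspective):
    # Stage 2: each persona asks questions answered via web search.
    # Stubbed as one (finding, source-url) pair per persona.
    slug = perspective.lower().replace(" ", "-")
    return [(f"{perspective} finding about {topic}", f"https://example.com/{slug}")]

def generate_outline(findings):
    # Stage 3: synthesize all findings into a section structure.
    return {"Background": findings[:1], "Applications": findings[1:]}

def write_article(outline, references):
    # Stage 4: write each section with inline numeric citations.
    sections = []
    for heading, evidence in outline.items():
        cited = " ".join(f"{text} [{references.index(url) + 1}]" for text, url in evidence)
        sections.append(f"## {heading}\n{cited}")
    return "\n\n".join(sections)

topic = "quantum computing applications"
findings = []
for persona in discover_perspectives(topic):
    findings.extend(research(topic, persona))

references = [url for _, url in findings]
article = write_article(generate_outline(findings), references)
print(article)
```

Running the sketch prints a two-section article with numeric citations, loosely mirroring the article, outline, and reference artifacts STORM writes to its output directory.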