Skills · Apr 6, 2026 · 2 min read

Continue — Open-Source AI Code Assistant for IDEs

Open-source AI code assistant for VS Code and JetBrains. Connect any LLM, use autocomplete, chat, and inline edits. Fully customizable with your own models and context. 22,000+ stars.

Skill Factory · Community
Quick Use

Use it first, then decide how deep to go

Everything to copy, install, and apply first:

  1. Install the extension: search "Continue" in the VS Code Marketplace or the JetBrains Marketplace
  2. Open Continue sidebar (Ctrl+L / Cmd+L)
  3. Configure your model in ~/.continue/config.json:
{
  "models": [
    {
      "title": "Claude Sonnet",
      "provider": "anthropic",
      "model": "claude-sonnet-4-20250514",
      "apiKey": "sk-ant-..."
    }
  ]
}
  4. Start chatting, use autocomplete, or select code and press Ctrl+I for inline edits.

Intro

Continue is an open-source AI code assistant for VS Code and JetBrains IDEs with 22,000+ GitHub stars. Unlike proprietary tools, Continue lets you connect any LLM — Claude, GPT-4, Gemini, Llama, Mistral, or local models via Ollama — and customize every aspect of the experience. Features include chat, autocomplete, inline editing, and context providers that pull from your docs, codebase, and tools. Best for developers who want full control over their AI coding assistant without vendor lock-in. Works with: VS Code, JetBrains (IntelliJ, PyCharm, WebStorm). Setup time: under 3 minutes.
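The fully local route mentioned above avoids API keys entirely: point Continue at a model served by Ollama. A minimal sketch, assuming `llama3.1` has already been pulled with `ollama pull llama3.1` (model choice is illustrative):

```json
{
  "models": [
    {
      "title": "Local Llama 3.1",
      "provider": "ollama",
      "model": "llama3.1"
    }
  ]
}
```

No `apiKey` field is needed here, since inference runs on your own machine.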


Core Features

Chat with Code Context

Select code and ask questions with full codebase awareness:

Ctrl+L → opens the chat sidebar
Select code → ask "Explain this function"
@ mention files → "@auth.ts how does this relate to the login flow?"

Autocomplete

Tab-complete with any model:

{
  "tabAutocompleteModel": {
    "title": "Codestral",
    "provider": "mistral",
    "model": "codestral-latest"
  }
}
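If you would rather not send keystrokes to a hosted API, the same field can point at a local model served by Ollama. A sketch, assuming a small code model such as `starcoder2:3b` has been pulled locally (the model choice is an example, not a recommendation from the Continue team):

```json
{
  "tabAutocompleteModel": {
    "title": "StarCoder2 (local)",
    "provider": "ollama",
    "model": "starcoder2:3b"
  }
}
```

Smaller models tend to work better for autocomplete, where latency matters more than depth.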

Inline Editing

Select code and describe changes:

Ctrl+I → "Add error handling and input validation"
Continue generates the edit; you accept or reject it.

Context Providers

Pull context from multiple sources:

| Provider | What It Does |
| --- | --- |
| @file | Reference specific files |
| @codebase | Search the entire codebase |
| @docs | Query indexed documentation |
| @terminal | Include terminal output |
| @url | Fetch and include web content |
| @git | Git diff and history context |
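Context providers are enabled in the same config file. A sketch, assuming the `contextProviders` array format (the `nRetrieve` parameter shown is illustrative; provider names mirror the table above):

```json
{
  "contextProviders": [
    { "name": "codebase", "params": { "nRetrieve": 25 } },
    { "name": "docs" },
    { "name": "terminal" }
  ]
}
```

Once enabled, each provider becomes available as an @ mention in chat.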

Model Flexibility

Use any model from any provider:

{
  "models": [
    {"title": "Claude", "provider": "anthropic", "model": "claude-sonnet-4-20250514"},
    {"title": "GPT-4", "provider": "openai", "model": "gpt-4o"},
    {"title": "Local Llama", "provider": "ollama", "model": "llama3.1"}
  ]
}

Custom Slash Commands

{
  "customCommands": [
    {
      "name": "test",
      "prompt": "Write comprehensive unit tests for the selected code. Use the existing test framework."
    }
  ]
}
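Commands defined this way are invoked with a slash in chat: select some code, then type /test. A second illustrative command (the name and prompt below are examples, not built-ins):

```json
{
  "customCommands": [
    {
      "name": "docstring",
      "prompt": "Add a concise docstring to the selected code, following the project's existing documentation style."
    }
  ]
}
```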

Key Stats

  • 22,000+ GitHub stars
  • VS Code + JetBrains support
  • 20+ model providers
  • Fully customizable config
  • Active community with 500+ contributors

FAQ

Q: What is Continue? A: Continue is an open-source AI code assistant for VS Code and JetBrains that lets you connect any LLM for chat, autocomplete, and inline editing with full customization.

Q: Is Continue free? A: Yes, fully open-source under Apache 2.0 license. You provide your own model API keys.

Q: How is Continue different from GitHub Copilot? A: Continue is open-source, supports any LLM provider (not just OpenAI), and is fully customizable — you control the models, context, and commands.



Source & Thanks

Created by Continue. Licensed under Apache 2.0.

Repository: continue — ⭐ 22,000+

Thanks to the Continue team for making AI code assistance open and customizable.
