Skills · Apr 6, 2026 · 2 min read

Continue — Open-Source AI Code Assistant for IDEs

Open-source AI code assistant for VS Code and JetBrains. Connect any LLM, use autocomplete, chat, and inline edits. Fully customizable with your own models and context. 22,000+ GitHub stars.

TL;DR
Open-source IDE extension connecting any LLM for autocomplete, chat, and inline edits with full customization.
§01

What it is

Continue is an open-source AI code assistant that runs inside VS Code and JetBrains IDEs. Unlike closed-source alternatives, Continue lets you bring your own model — OpenAI, Anthropic, local Ollama, or any OpenAI-compatible endpoint. It provides three core features: tab autocomplete, chat sidebar, and inline edit.

The tool targets developers who want AI assistance without vendor lock-in. You control which model runs, what context it sees, and how it behaves through a JSON configuration file.

§02

How it saves time or tokens

Continue uses context-aware autocomplete that indexes your codebase locally. This means suggestions are grounded in your actual code, not generic training data. The chat sidebar lets you ask questions about your codebase without leaving the IDE or copying code into a browser. Token usage stays efficient because Continue sends only relevant file context, not your entire project.
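The context behavior described above is controlled through context providers in the config file. A minimal sketch (the provider names and params below are illustrative and should be checked against the current Continue documentation):

```json
// ~/.continue/config.json (excerpt) — context providers are illustrative
{
  "contextProviders": [
    { "name": "codebase", "params": {} },
    { "name": "open", "params": {} }
  ]
}
```

With a setup like this, chat requests pull in embedding-matched snippets from the local index plus currently open files, rather than the whole project.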

§03

How to use

  1. Install the Continue extension from the VS Code or JetBrains marketplace.
  2. Open the Continue config file and add your model provider credentials.
  3. Start coding — autocomplete activates on tab, chat opens in the sidebar.
// ~/.continue/config.json
{
  "models": [
    {
      "title": "Claude Sonnet",
      "provider": "anthropic",
      "model": "claude-sonnet-4-20250514",
      "apiKey": "sk-ant-..."
    }
  ],
  "tabAutocompleteModel": {
    "title": "Local Ollama",
    "provider": "ollama",
    "model": "starcoder2:3b"
  }
}
§04

Example

// In the chat sidebar, you can ask:
'Explain what the handleAuth middleware does in this project'

// Continue finds the relevant file, reads the function,
// and explains it in context of your codebase.

// For inline edits, select code and press Cmd+I:
'Refactor this function to use async/await instead of callbacks'
§05


Common pitfalls

  • Using a large model for tab autocomplete creates noticeable latency; use a small, fast model (3B parameters or less) for autocomplete and reserve larger models for chat.
  • The config.json file is per-user, not per-project; use workspace-level overrides if different projects need different models.
  • Context window limits apply — if you select too many files for the chat context, the model may truncate or ignore earlier files.
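For the per-user config pitfall, one workaround is a per-project override file that merges over the global config. This is a sketch under the assumption that Continue supports a workspace-level `.continuerc.json`; verify the file name and merge behavior against the current docs:

```json
// <project-root>/.continuerc.json — hypothetical workspace override,
// merged over ~/.continue/config.json
{
  "models": [
    {
      "title": "Project Model",
      "provider": "openai",
      "model": "gpt-4o",
      "apiKey": "sk-..."
    }
  ]
}
```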

Frequently Asked Questions

Is Continue really free?

Yes. Continue is open source under the Apache 2.0 license and the extension is free to install. You pay only for the LLM API calls you make. If you use a local model via Ollama, there are no API costs at all.

Which models work best with Continue?

For chat and inline edits, Claude Sonnet or GPT-4o provide strong results. For tab autocomplete, smaller models like StarCoder2 3B or DeepSeek Coder 1.3B via Ollama give fast suggestions with low latency. You can mix different models for different features.

Does Continue support JetBrains IDEs?

Yes. Continue has a JetBrains plugin that supports IntelliJ IDEA, PyCharm, WebStorm, and other JetBrains IDEs. The feature set is the same as the VS Code extension: autocomplete, chat, and inline edits.

Can Continue index my entire codebase?

Continue builds a local index of your codebase for context retrieval. It uses embeddings to find relevant code when you ask questions in chat. The index stays on your machine and updates automatically as you edit files.

How does Continue compare to GitHub Copilot?

Continue is open source and model-agnostic — you choose your own LLM provider. GitHub Copilot is a closed-source service built primarily around OpenAI models. Continue gives you more control over privacy, model selection, and configuration, at the cost of supplying your own API keys.

Citations (3)
  • Continue GitHub — Continue is an open-source AI code assistant with 22,000+ GitHub stars
  • Continue Documentation — Continue supports VS Code and JetBrains IDEs with chat, autocomplete, and inline…
  • Ollama GitHub — Ollama runs LLMs locally with an OpenAI-compatible API

Source & Thanks

Created by Continue. Licensed under Apache 2.0.

continue — ⭐ 22,000+

Thanks to the Continue team for making AI code assistance open and customizable.
