Scripts · Mar 29, 2026 · 2 min read

Continue — Open-Source AI Code Assistant

Open-source AI code assistant for VS Code and JetBrains. Tab autocomplete, chat, inline editing with any model — OpenAI, Anthropic, Ollama, or self-hosted.

TL;DR
Continue brings AI tab autocomplete, chat, and inline editing to VS Code and JetBrains with any model provider.
§01

What it is

Continue is an open-source AI code assistant that integrates into VS Code and JetBrains IDEs. It provides tab autocomplete, chat, and inline editing powered by any LLM provider -- OpenAI, Anthropic, Ollama, or your own self-hosted endpoint.

Continue is for developers who want AI coding assistance with full control over which model they use and where their code is sent. Unlike closed-source alternatives, Continue lets you swap providers, run local models for privacy, or use your company's internal API endpoint.

§02

How it saves time or tokens

Continue reduces context-switching between your IDE and external chat interfaces. Instead of copying code into a browser-based LLM, you get inline suggestions and edits directly in your editor.

With model flexibility, you can route different tasks to different models. Use a fast local model for tab autocomplete (low latency matters) and a larger cloud model for complex refactoring or code generation. This optimizes both cost and speed.
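The routing idea can be sketched as a simple dispatch table. This is illustrative only: the model identifiers below are examples, not fixed Continue names.

```python
# Illustrative sketch: route each task type to the model best suited for it.
MODEL_FOR_TASK = {
    "autocomplete": "ollama/starcoder2:3b",   # local: low latency matters
    "chat": "anthropic/claude-sonnet",        # cloud: stronger reasoning
    "refactor": "anthropic/claude-sonnet",
}

def pick_model(task: str) -> str:
    # Fall back to a local model for unknown task types.
    return MODEL_FOR_TASK.get(task, "ollama/codellama:13b")
```

In Continue itself this split is expressed in configuration rather than code, as the config example in the next section shows.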

§03

How to use

  1. Install Continue from the VS Code Marketplace or JetBrains Marketplace -- search for 'Continue'.
  2. Configure your model provider in ~/.continue/config.json:
{
  "models": [
    {
      "title": "Claude Sonnet",
      "provider": "anthropic",
      "model": "claude-sonnet-4-20250514",
      "apiKey": "sk-ant-..."
    },
    {
      "title": "Ollama Local",
      "provider": "ollama",
      "model": "codellama:13b"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Starcoder",
    "provider": "ollama",
    "model": "starcoder2:3b"
  }
}
  3. Use Cmd+I (or Ctrl+I) for inline editing, Cmd+L for chat, and Tab for autocomplete.
§04

Example

Using Continue's inline edit to refactor a function:

# Select this function, press Cmd+I, type 'add error handling and type hints'
import requests

def fetch_user(user_id):
    response = requests.get(f'/api/users/{user_id}')
    return response.json()

# Continue generates:
def fetch_user(user_id: int) -> dict:
    try:
        response = requests.get(f'/api/users/{user_id}')
        response.raise_for_status()
        return response.json()
    except requests.RequestException as e:
        # UserFetchError is a project-specific exception class
        raise UserFetchError(f'Failed to fetch user {user_id}') from e
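To see the generated error handling in action, here is a self-contained version. UserFetchError is a hypothetical project exception (the original snippet assumes it exists elsewhere), and requests.get is stubbed with unittest.mock so the sketch runs without a network.

```python
from unittest import mock

import requests

class UserFetchError(Exception):
    """Raised when the user API call fails (hypothetical helper)."""

def fetch_user(user_id: int) -> dict:
    try:
        response = requests.get(f"/api/users/{user_id}")
        response.raise_for_status()
        return response.json()
    except requests.RequestException as e:
        raise UserFetchError(f"Failed to fetch user {user_id}") from e

# Simulate a network failure and confirm it surfaces as UserFetchError.
with mock.patch("requests.get", side_effect=requests.ConnectionError("down")):
    try:
        fetch_user(42)
    except UserFetchError as e:
        assert isinstance(e.__cause__, requests.ConnectionError)
```

Wrapping the low-level requests exception preserves the original traceback via `from e` while giving callers a single exception type to catch.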
§05

Common pitfalls

  • Tab autocomplete latency depends on your model choice. Cloud models add network round-trip time. For the best autocomplete experience, use a local model (Ollama + Starcoder or CodeLlama).
  • Continue's context window is limited by the model you choose. Large files may exceed the context, producing incomplete suggestions. Use the @file and @folder context providers to scope what Continue sees.
  • Configuration is per-user in ~/.continue/config.json. For team-wide settings, use the .continue/ directory in your project root.
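The context-window pitfall can be guarded against with a rough pre-check before sending a large file. The ~4 characters-per-token ratio below is a crude heuristic for English and code, not a real tokenizer, and the window sizes are illustrative defaults.

```python
# Rough sketch: estimate whether a file fits the model's context window
# before sending it. ~4 chars/token is a crude heuristic, not a tokenizer.
def rough_token_count(text: str) -> int:
    return max(1, len(text) // 4)

def fits_in_context(text: str, context_window: int = 8192, reserved: int = 1024) -> bool:
    # Reserve room for the system prompt and the model's reply.
    return rough_token_count(text) <= context_window - reserved
```

If a file fails a check like this, scoping with @file or @folder (or splitting the request) is the practical fix.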

Frequently Asked Questions

Is Continue free to use?

Continue itself is free and open source under the Apache 2.0 license. You pay for the LLM provider you choose -- OpenAI, Anthropic, or others. If you use a local model via Ollama, there are no API costs at all.

What IDEs does Continue support?

Continue supports VS Code (and VS Code forks like Cursor and Windsurf) and JetBrains IDEs (IntelliJ, PyCharm, WebStorm, GoLand, and others). Install it from the respective marketplace.

Can I use Continue with a local model?

Yes. Continue integrates with Ollama, LM Studio, and llama.cpp for local inference. This keeps your code on your machine and eliminates API costs. Local models work for autocomplete, chat, and inline editing.

How does Continue compare to GitHub Copilot?

Continue is open source and model-agnostic. You choose your provider and can run local models. Copilot is a proprietary service tied to GitHub and OpenAI. Continue offers more flexibility at the cost of self-managing your model configuration.

Does Continue support custom context providers?

Yes. Continue supports context providers like @file, @folder, @codebase, @docs, and @web. You can also build custom context providers to pull data from your internal tools, wikis, or databases into the AI context.
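Conceptually, a custom context provider maps a user query to a list of context items injected into the prompt. The sketch below shows that idea only; it is not Continue's actual API, which is configured in TypeScript, and the wiki data is invented for illustration.

```python
# Conceptual sketch only -- not Continue's real provider API. A provider
# takes a query and returns context items (name + content) for the prompt.
def wiki_context_provider(query: str, pages: dict[str, str]) -> list[dict]:
    return [
        {"name": title, "content": body}
        for title, body in pages.items()
        if query.lower() in body.lower()
    ]

pages = {"Deploy guide": "How we deploy to staging", "Oncall": "Pager rotation"}
items = wiki_context_provider("deploy", pages)
```

A real provider would fetch from your wiki or database instead of an in-memory dict, but the shape of the result is the same: named chunks of text the model can read.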


Source & Thanks

Created by Continue. Licensed under Apache 2.0. continuedev/continue — 22K+ GitHub stars
