Continue — Open-Source AI Code Assistant
Open-source AI code assistant for VS Code and JetBrains. Tab autocomplete, chat, inline editing with any model — OpenAI, Anthropic, Ollama, or self-hosted.
What it is
Continue is an open-source AI code assistant that integrates into VS Code and JetBrains IDEs. It provides tab autocomplete, chat, and inline editing powered by any LLM provider -- OpenAI, Anthropic, Ollama, or your own self-hosted endpoint.
Continue is for developers who want AI coding assistance with full control over which model they use and where their code is sent. Unlike closed-source alternatives, Continue lets you swap providers, run local models for privacy, or use your company's internal API endpoint.
How it saves time or tokens
Continue reduces context-switching between your IDE and external chat interfaces. Instead of copying code into a browser-based LLM, you get inline suggestions and edits directly in your editor.
With model flexibility, you can route different tasks to different models. Use a fast local model for tab autocomplete (low latency matters) and a larger cloud model for complex refactoring or code generation. This optimizes both cost and speed.
How to use
- Install Continue from the VS Code Marketplace or JetBrains Marketplace -- search for 'Continue'.
- Configure your model provider in ~/.continue/config.json:

```json
{
  "models": [
    {
      "title": "Claude Sonnet",
      "provider": "anthropic",
      "model": "claude-sonnet-4-20250514",
      "apiKey": "sk-ant-..."
    },
    {
      "title": "Ollama Local",
      "provider": "ollama",
      "model": "codellama:13b"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Starcoder",
    "provider": "ollama",
    "model": "starcoder2:3b"
  }
}
```
- Use Cmd+I (or Ctrl+I) for inline editing, Cmd+L for chat, and Tab to accept autocomplete suggestions.
Example
Using Continue's inline edit to refactor a function:
```python
import requests

# Select this function, press Cmd+I, type 'add error handling and type hints'
def fetch_user(user_id):
    response = requests.get(f'/api/users/{user_id}')
    return response.json()

# Continue generates:
def fetch_user(user_id: int) -> dict:
    try:
        response = requests.get(f'/api/users/{user_id}')
        response.raise_for_status()
        return response.json()
    except requests.RequestException as e:
        raise UserFetchError(f'Failed to fetch user {user_id}') from e
```
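Note that the generated code raises UserFetchError, a custom exception the model invents rather than something Continue or requests provides. A minimal sketch of that class and the exception-chaining pattern it relies on (the class body and demo are assumptions, not Continue output):

```python
class UserFetchError(Exception):
    """Raised when a user record cannot be fetched from the API."""

def demo() -> None:
    # Simulate the wrapped failure without making a real HTTP call
    try:
        raise ConnectionError("simulated network failure")
    except ConnectionError as e:
        # `raise ... from e` preserves the original error as __cause__
        raise UserFetchError("Failed to fetch user 42") from e
```

Catching UserFetchError lets callers inspect err.__cause__ to recover the underlying network error while presenting a single, domain-specific exception type.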
Related on TokRepo
- Coding AI tools -- more AI-powered code assistants
- Local LLM tools -- run models locally for privacy
Common pitfalls
- Tab autocomplete latency depends on your model choice. Cloud models add network round-trip time. For the best autocomplete experience, use a local model (Ollama + Starcoder or CodeLlama).
- Continue's context window is limited by the model you choose. Large files may exceed the context, producing incomplete suggestions. Use the @file and @folder context providers to scope what Continue sees.
- Configuration is per-user in ~/.continue/config.json. For team-wide settings, commit a .continue/ directory to your project root instead.
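To scope what Continue sees when a file is too large, you can reference a specific file in the chat panel with a context provider. A sketch of such a prompt (the file path and instruction here are hypothetical):

```
@file src/api/users.py add retry logic to fetch_user
```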
Frequently Asked Questions
Is Continue free to use?
Continue itself is free and open source under the Apache 2.0 license. You pay for the LLM provider you choose -- OpenAI, Anthropic, or others. If you use a local model via Ollama, there are no API costs at all.
Which IDEs does Continue support?
Continue supports VS Code (and VS Code forks like Cursor and Windsurf) and JetBrains IDEs (IntelliJ, PyCharm, WebStorm, GoLand, and others). Install it from the respective marketplace.
Can I run Continue with local models?
Yes. Continue integrates with Ollama, LM Studio, and llama.cpp for local inference. This keeps your code on your machine and eliminates API costs. Local models work for autocomplete, chat, and inline editing.
How does Continue compare to GitHub Copilot?
Continue is open source and model-agnostic. You choose your provider and can run local models. Copilot is a proprietary service tied to GitHub and OpenAI. Continue offers more flexibility at the cost of self-managing your model configuration.
Can Continue use my codebase or docs as context?
Yes. Continue supports context providers like @file, @folder, @codebase, @docs, and @web. You can also build custom context providers to pull data from your internal tools, wikis, or databases into the AI context.
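Built-in providers are enabled in the same config.json as your models. A sketch of what that block can look like (field names follow Continue's documented JSON config format; newer releases may use a YAML config instead, so treat this as an assumption to verify against your version):

```json
{
  "contextProviders": [
    { "name": "code" },
    { "name": "docs" },
    { "name": "codebase" },
    { "name": "folder" }
  ]
}
```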
Citations (3)
- Continue GitHub -- Continue open-source AI code assistant
- Continue Documentation -- Configuration and context providers
- Anthropic API Docs -- Anthropic Claude API for code assistance
Source & Thanks
Created by Continue. Licensed under Apache 2.0. continuedev/continue -- 22K+ GitHub stars
Related Assets
Moodle — Open-Source Learning Management System
The most widely used open-source learning platform, providing course management, assessments, and collaboration tools for educators and organizations worldwide.
Sylius — Headless E-Commerce Framework on Symfony
An open-source headless e-commerce platform built on Symfony and API Platform, designed for developers who need a customizable and API-first commerce solution.
Akaunting — Free Self-Hosted Accounting Software
A free, open-source online accounting application built on Laravel for small businesses and freelancers to manage invoices, expenses, and financial reports.