Continue

The leading open-source AI code assistant

Continue is the leading open-source AI code assistant that works inside VS Code and JetBrains IDEs. Unlike proprietary tools, Continue lets you bring your own models and API keys (BYOK): you can connect to local LLMs via Ollama, or plug in keys for GPT-4, Claude 3.5, and DeepSeek. That gives you full control over your data privacy and the AI stack you use.

Category

Extension

Rating

4.7

Users

400K+

Platform Support

macOS, Windows, Linux

Pricing Plans

Free Plan

$0

Free & Open Source (Apache 2.0)

Recommended

Pro Plan

Free forever

Individual: Full features, bring your own API key or run local models

Teams Plan

Custom per user

Continue for Teams: SSO, audit logs, standardized config

Enterprise Plan

Custom

Enterprise: Hub for centralized configuration, analytics, and governance

Core Features

1 Core Features

Model Agnostic

Use any model you want: GPT-4o, Claude 3.5 Sonnet, DeepSeek, or local Llama 3
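
For illustration, a minimal config.json sketch that registers one local Ollama model and one cloud model behind your own API key (field names follow Continue's config.json schema; exact keys, provider names, and model IDs may differ across versions):

    {
      "models": [
        {
          "title": "Llama 3 (local)",
          "provider": "ollama",
          "model": "llama3"
        },
        {
          "title": "Claude 3.5 Sonnet",
          "provider": "anthropic",
          "model": "claude-3-5-sonnet-latest",
          "apiKey": "YOUR_ANTHROPIC_API_KEY"
        }
      ]
    }

Both entries then appear in the model picker, so you can switch per conversation without changing anything else.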

Context Providers

Type @ to reference files, docs, terminal output, or git diffs in your chat
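
The providers behind @ are themselves configurable. A hedged config.json sketch enabling a few of the built-in ones (names such as codebase, docs, diff, and terminal match Continue's documented providers; verify against your version):

    {
      "contextProviders": [
        { "name": "codebase" },
        { "name": "docs" },
        { "name": "diff" },
        { "name": "terminal" }
      ]
    }

In chat, typing @diff then pulls your current git diff into the prompt, and @codebase retrieves relevant files.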

Local AI Support

First-class support for Ollama, LM Studio, and other local inference servers
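
As a sketch, pointing Continue at an OpenAI-compatible local server such as LM Studio mostly comes down to setting apiBase; the provider value, model name, and port below are illustrative defaults, not guaranteed for your setup:

    {
      "models": [
        {
          "title": "LM Studio (local)",
          "provider": "lmstudio",
          "model": "deepseek-coder-6.7b-instruct",
          "apiBase": "http://localhost:1234/v1"
        }
      ]
    }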

Tab Autocomplete

Predictive code completion using low-latency models like StarCoder2 or DeepSeek-Coder
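
Autocomplete uses its own, usually smaller model, configured separately from chat. A hedged config.json sketch pairing it with StarCoder2 served locally by Ollama (the model tag is illustrative):

    {
      "tabAutocompleteModel": {
        "title": "StarCoder2 3B",
        "provider": "ollama",
        "model": "starcoder2:3b"
      }
    }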

2 Flexibility

Customizability

Configure everything via config.json: custom prompts, slash commands, and model parameters
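
For example, a custom slash command is just a named prompt in config.json; the /review command below is a made-up illustration rather than a built-in, and the {{{ input }}} placeholder is the template variable Continue's docs describe for the selected code:

    {
      "customCommands": [
        {
          "name": "review",
          "description": "Review the selected code",
          "prompt": "Review the following code for bugs, edge cases, and style issues, then suggest concrete fixes:\n\n{{{ input }}}"
        }
      ]
    }

With code highlighted, you would run it from chat as /review.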

Cross-IDE

Consistent experience across VS Code and JetBrains (IntelliJ, PyCharm, etc.)

3 Developer Experience

Edit Mode

Highlight code and press Cmd+I to refactor or fix bugs inline

Docs Indexing

Index documentation sites to ask questions about third-party libraries
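
Documentation sites are declared in config.json and then queried with @docs. A sketch, using an arbitrary example site:

    {
      "docs": [
        {
          "title": "FastAPI",
          "startUrl": "https://fastapi.tiangolo.com/"
        }
      ]
    }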

Data Privacy

Your code never leaves your machine unless you use a cloud API; full telemetry control
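
Anonymous telemetry is a single switch in config.json (key name as documented by Continue; verify against your version):

    {
      "allowAnonymousTelemetry": false
    }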

Pros and Cons

Advantages

  • Completely open-source and free to use
  • Unmatched flexibility: swap models instantly (e.g., use DeepSeek for chat, StarCoder for autocomplete)
  • Best-in-class support for Local LLMs (Ollama)
  • Powerful context system (@docs, @codebase, @git)
  • No vendor lock-in; you own your configuration

Disadvantages

  • Requires more initial setup (configuration) than Copilot
  • Autocomplete latency depends on your chosen provider/hardware
  • The UI is functional but feels less "native" than Cursor's
  • Indexing for @codebase can be slower on very large repos compared to specialized vector DBs

Best For

Privacy-conscious developers who want to run local models

Power users who want to mix and match different AI models

Teams that need a self-hosted AI coding solution

Developers using JetBrains IDEs who want a better chat experience

Tech Stack

Architecture

IDE Extension (VS Code & JetBrains)

AI Models

Any (Ollama, OpenAI, Anthropic, DeepSeek, Mistral)

Platforms

macOS, Windows, Linux

Community Resources