# Mistral Codestral — 22B Open-Weight Coding Model

> Codestral is Mistral's coding-specialized model: 22B params, 32K context, 80+ languages including Python, JS, Go, Rust, and Bash. Free for non-commercial use.

## Install

Copy the content below into your project:

## Quick Use

1. Local: `ollama pull codestral && ollama run codestral`
2. API: get a key at console.mistral.ai, then `export MISTRAL_API_KEY=...`
3. Plug into Continue / Aider / LiteLLM with the snippets below

---

## Intro

Codestral is Mistral AI's coding-specialized model: 22B parameters, 32K context, trained on 80+ languages from Python and JavaScript to Bash and Fortran. The open-weight Codestral 22B is free for research and non-commercial use; the latest Codestral models are available via Mistral's API and Le Chat for production.

- Best for: local code completion via Continue / Aider, latency-sensitive code generation
- Works with: Ollama, vLLM, llama.cpp, Continue, Aider, LiteLLM
- Setup time: 5 minutes (Ollama pull or API key)

---

### Run Codestral locally with Ollama

```bash
# Pull the model
ollama pull codestral

# Interactive chat
ollama run codestral

# Use as a completion server (default port 11434)
curl http://localhost:11434/api/generate -d '{
  "model": "codestral",
  "prompt": "Write a Rust function that returns the Nth Fibonacci number using memoization."
}'
```

### Use Codestral with Continue

```jsonc
// config.json in your Continue extension
{
  "models": [
    {
      "title": "Codestral (local)",
      "provider": "ollama",
      "model": "codestral",
      "apiBase": "http://localhost:11434"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Codestral Autocomplete",
    "provider": "ollama",
    "model": "codestral"
  }
}
```

### Use Codestral with Aider

```bash
# Via the Mistral API (production)
export MISTRAL_API_KEY=your-key
aider --model mistral/codestral-latest

# Or via local Ollama
aider --model ollama/codestral
```

### Through LiteLLM

```python
import os

from litellm import completion

response = completion(
    model="codestral/codestral-latest",
    messages=[{"role": "user", "content": "Refactor this Python function to use list comprehensions"}],
    api_key=os.environ["CODESTRAL_API_KEY"],
)
```

### Codestral vs general Mistral models

| Model | Specialty | Best for |
|---|---|---|
| Mistral Small | General-purpose | Chat, summarization, RAG |
| Codestral | Code-only | Tab completion, refactoring, code Q&A |
| Mixtral 8x7B | Mixture-of-experts | High-throughput inference |

---

### FAQ

**Q: Is Codestral free?**
A: The Codestral 22B open weights are free under the Mistral AI Non-Production License (research and non-commercial use). For production or commercial use, call Codestral via Mistral's paid API (La Plateforme, console.mistral.ai).

**Q: How does Codestral compare to Claude / GPT for coding?**
A: On code-specific benchmarks (HumanEval, MBPP), Codestral is competitive with Claude 3.5 Sonnet. For broad reasoning combined with coding, Claude / GPT-4 still lead. Codestral's advantages are local deployment and lower latency.

**Q: Can I fine-tune Codestral?**
A: Yes. Mistral provides LoRA fine-tuning recipes for Codestral, and the open-weight version can be fine-tuned with any standard tool (axolotl, unsloth, trl). Mistral's hosted fine-tuning API is also available.

---

## Source & Thanks

> Built by [Mistral AI](https://github.com/mistralai). Open-weights under the Mistral AI Non-Production License.
>
> [Codestral on Hugging Face](https://huggingface.co/mistralai/Codestral-22B-v0.1) — ⭐ 1,500+

---

Source: https://tokrepo.com/en/workflows/mistral-codestral-22b-open-weight-coding-model
Author: Mistral AI
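
---

### Appendix: calling the Ollama endpoint from Python

The curl call in the Ollama section can be reproduced from Python with only the standard library. This is a minimal sketch, not part of any SDK: the `build_payload` and `generate` helper names are our own, and it assumes Ollama is serving codestral locally on its default port 11434.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default generate endpoint


def build_payload(prompt: str, model: str = "codestral") -> dict:
    # "stream": False asks Ollama for a single JSON object
    # instead of a stream of NDJSON chunks.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str) -> str:
    # Requires `ollama serve` running locally with codestral already pulled.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Ollama returns the completion text under the "response" key.
        return json.loads(resp.read())["response"]


# Example (needs a running Ollama server):
#   print(generate("Write a Rust function that returns the Nth Fibonacci number."))
```

Swapping `"stream": False` for `True` returns incremental NDJSON chunks, which is what editor integrations like Continue consume for tab completion.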