Prompts · Apr 7, 2026 · 2 min read

Granite Code — IBM Open Source AI Coding Models

IBM's open-source code LLMs, from 3B to 34B parameters, trained on 116 programming languages and released under the Apache 2.0 license. Among the strongest open-source models on code-generation benchmarks.

What is Granite Code?

Granite Code is IBM's family of open-source code language models, ranging from 3B to 34B parameters. Trained on 116 programming languages with a deduplicated dataset of 3-4 trillion tokens, these models deliver strong performance on code generation, completion, and explanation tasks.

Answer-Ready: Granite Code is IBM's open-source code LLM family (3B to 34B parameters) trained on 116 programming languages. Apache 2.0 licensed, it ranks among top open-source code models on HumanEval and MBPP benchmarks.

Best for: Teams needing a self-hosted, commercially licensed code AI. Works with: Ollama, vLLM, Hugging Face, any OpenAI-compatible server. Setup time: Under 5 minutes with Ollama.

Model Variants

| Model | Parameters | Use Case |
| --- | --- | --- |
| granite-3b-code | 3B | Edge devices, fast completion |
| granite-8b-code | 8B | General coding, best speed/quality balance |
| granite-20b-code | 20B | Complex code generation |
| granite-34b-code | 34B | Maximum quality, multi-file tasks |

Each has -base (completion) and -instruct (chat/instruction) variants.
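As a quick illustration of the naming scheme, here is a minimal sketch that composes Hugging Face model IDs from a size and variant. The `ibm-granite/granite-<size>-code-<variant>` pattern matches the vLLM example later on this page; confirm exact IDs against the Hugging Face hub before pulling.

```python
# Minimal sketch (assumption: IDs follow the pattern seen in the vLLM
# example, ibm-granite/granite-8b-code-instruct). Verify on the hub.

SIZES = {"3b", "8b", "20b", "34b"}
VARIANTS = {"base", "instruct"}

def granite_model_id(size: str, variant: str = "instruct") -> str:
    """Compose a Granite Code model ID from its size and variant."""
    if size not in SIZES or variant not in VARIANTS:
        raise ValueError(f"unknown size/variant: {size}/{variant}")
    return f"ibm-granite/granite-{size}-code-{variant}"
```

For example, `granite_model_id("8b")` yields the same ID used in the vLLM deployment command below.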

Key Strengths

1. 116 Languages

Python, JavaScript, TypeScript, Java, Go, Rust, C/C++, SQL, and 108 more.

2. Apache 2.0 License

Fully open for commercial use — no restrictions, no fees, no CLA.

3. Benchmark Performance

The 34B Granite Code model competes with Code Llama 34B and StarCoder2 15B on:

  • HumanEval: Python code generation
  • MBPP: Multi-language programming
  • DS-1000: Data science tasks

4. Fill-in-the-Middle

Supports FIM for IDE-style code completion:

<fim_prefix>def add(a, b):
    <fim_suffix>
    return result<fim_middle>
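The template above follows the prefix-suffix-middle (PSM) ordering: the model generates the missing middle after the `<fim_middle>` tag. A minimal sketch of assembling such a prompt, assuming the tag names shown in the example (check the model card for the exact special tokens):

```python
# Minimal sketch of a PSM-style fill-in-the-middle prompt builder.
# Tag names are taken from the example above; verify against the
# Granite Code model card before relying on them.

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a FIM prompt; the model completes text after <fim_middle>."""
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

prompt = build_fim_prompt("def add(a, b):\n    ", "\n    return result")
```

Here the model is expected to fill in the body between the function signature and the `return result` line.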

5. Long Context

Supports up to 128K tokens for large codebase understanding.

Deployment Options

# Ollama (easiest)
ollama serve
ollama pull granite-code:8b

# vLLM (production)
pip install vllm
vllm serve ibm-granite/granite-8b-code-instruct

# Continue.dev (IDE integration)
# Add to ~/.continue/config.json
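Because both vLLM and Ollama can expose an OpenAI-compatible `/v1/chat/completions` endpoint, any OpenAI-style client can talk to a deployed Granite model. A minimal sketch of building such a request body; the model tag and localhost URL in the comment are assumptions tied to the Ollama setup above:

```python
# Minimal sketch: build an OpenAI-style chat completion request body.
# The model tag ("granite-code:8b") matches the Ollama pull command above;
# the endpoint URL is an assumption for a default local Ollama install.
import json

def chat_payload(model: str, user_msg: str, max_tokens: int = 256) -> dict:
    """Return an OpenAI-compatible chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_msg}],
        "max_tokens": max_tokens,
    }

body = json.dumps(chat_payload(
    "granite-code:8b",
    "Write a Python function that reverses a string.",
))
# POST `body` to http://localhost:11434/v1/chat/completions (Ollama default)
```

The same payload works against a vLLM server; only the base URL and model name change.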

FAQ

Q: How does it compare to Code Llama? A: Granite 34B matches or exceeds Code Llama 34B on most benchmarks, with a more permissive Apache 2.0 license (vs Llama's custom license).

Q: Can I use it commercially? A: Yes, Apache 2.0 — no restrictions on commercial use.

Q: Best model size for coding? A: 8B offers the best speed/quality tradeoff. Use 34B for complex multi-file tasks.


Source and acknowledgments

Created by IBM Research. Licensed under Apache 2.0.

ibm-granite/granite-code-models — 2k+ stars
