TokRepo
DeepSeek

Joined March 2026
3 active · 0 stars earned · 20 total views
Knowledge (3)

DeepSeek-V3 — Open-Weight 671B MoE Model with GPT-4o Quality

DeepSeek-V3 is a 671B-parameter MoE model (37B active per token) that matches GPT-4o on benchmarks. Weights are MIT-licensed; the hosted API charges $0.27 per 1M input tokens.

May 8, 2026
6
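The hosted API mentioned in the card can be exercised with a minimal payload sketch. This is a hypothetical illustration: it assumes the API is OpenAI-compatible and that the chat model is named "deepseek-chat"; it only builds the request and does not send it.

```python
# Assumed endpoint and model name -- illustrative, not verified here.
API_URL = "https://api.deepseek.com/chat/completions"

def build_chat_request(prompt: str, model: str = "deepseek-chat") -> dict:
    """Return an OpenAI-style chat-completion payload (not sent here)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

payload = build_chat_request("Summarize MoE routing in one sentence.")
```

Sending the payload would additionally require an HTTP client and an API key.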

DeepSeek-R1 — Open-Weight Reasoning Model Rivaling OpenAI o1

DeepSeek-R1 is an open-weight reasoning model that matches OpenAI o1 on math, code, and science benchmarks. Its chain-of-thought is streamed and visible. MIT-licensed.

May 8, 2026
6
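A client consuming the streamed chain-of-thought would separate reasoning tokens from answer tokens. The sketch below uses a fake stream and assumes a "reasoning_content" delta field modeled on OpenAI-style streaming; the real response schema is not verified here.

```python
def split_stream(deltas):
    """Collect reasoning text and answer text from a stream of delta dicts."""
    reasoning, answer = [], []
    for delta in deltas:
        if delta.get("reasoning_content"):   # assumed field for CoT tokens
            reasoning.append(delta["reasoning_content"])
        if delta.get("content"):             # final-answer tokens
            answer.append(delta["content"])
    return "".join(reasoning), "".join(answer)

# Fake chunks standing in for real streamed API deltas.
fake_deltas = [
    {"reasoning_content": "Check small primes; "},
    {"reasoning_content": "17 is prime."},
    {"content": "Yes, 17 is prime."},
]
thoughts, answer = split_stream(fake_deltas)
```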

DeepSeek Coder — Code-Specialized Model for Local Inference

DeepSeek Coder is a code-specialized open-weight model with fill-in-the-middle (FIM) support. It beats Codestral on HumanEval and drops into Continue and Aider.

May 8, 2026
8
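Fill-in-the-middle means the model completes the gap between a prefix and a suffix, which is how editor integrations request completions at the cursor. The payload sketch below follows the common prompt/suffix completion convention; the field names and the "deepseek-coder" model name are assumptions, not a verified API.

```python
def build_fim_request(prefix: str, suffix: str,
                      model: str = "deepseek-coder") -> dict:
    """Return a completion payload asking the model to fill the middle."""
    return {
        "model": model,       # assumed model name
        "prompt": prefix,     # code before the cursor
        "suffix": suffix,     # code after the cursor
        "max_tokens": 64,
    }

req = build_fim_request("def add(a, b):\n    return ", "\n\nprint(add(1, 2))")
```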
© 2026 TokRepo. All rights reserved.


軒轅十四株式会社 · Tokyo, Japan

〒101-0032 Tokyo, Chiyoda-ku, Iwamotocho 2-chome

Contact: ethanfrostcool@gmail.com