
Qwen2.5-Coder 32B Instruct
Qwen
Qwen2.5-Coder is a coding model trained on 5.5 trillion tokens spanning 92 programming languages. With a 128K-token context window, it is designed for code generation, completion, and repair, and it also performs well on mathematical and general-knowledge tasks.
Model Specifications
Technical details and capabilities of Qwen2.5-Coder 32B Instruct
Core Specifications
32B Parameters
Model size and complexity
5.5T Training Tokens
Amount of data used in training
128K / 128K
Input / Output tokens
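The 128K figures above describe a single context window shared by the prompt and the completion: the longer the input, the fewer tokens remain for generated output. A minimal sketch of that budget arithmetic (the helper name and the 131,072-token interpretation of "128K" are assumptions for illustration):

```python
CONTEXT_WINDOW = 131_072  # 128K tokens, shared by prompt and completion

def max_completion_tokens(prompt_tokens: int,
                          context_window: int = CONTEXT_WINDOW) -> int:
    """Tokens left for the model's output after the prompt is accounted for."""
    if prompt_tokens >= context_window:
        raise ValueError("prompt alone exceeds the context window")
    return context_window - prompt_tokens

# A 100K-token repository dump leaves roughly 31K tokens for generated code.
print(max_completion_tokens(100_000))
```

In practice the exact prompt length depends on the model's tokenizer, so counts like these are estimates until the prompt is actually tokenized.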
February 29, 2024
Knowledge cutoff date
September 18, 2024
Release date
Capabilities & License
Performance Insights
Check out how Qwen2.5-Coder 32B Instruct handles various AI tasks through comprehensive benchmark results.
Model Comparison
See how Qwen2.5-Coder 32B Instruct stacks up against other leading models across key performance metrics.
Detailed Benchmarks
Dive deeper into Qwen2.5-Coder 32B Instruct's performance across specific task categories. Expand each section to see detailed metrics and comparisons.
Math
GSM8K
MATH
Coding
HumanEval
MBPP
LiveCodeBench
Reasoning
HellaSwag
Knowledge
MMLU
Hallucination
TruthfulQA
Uncategorized
ARC-Challenge
WinoGrande
MMLU-Pro
MMLU-Redux
TheoremQA
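Coding benchmarks such as HumanEval and MBPP are commonly reported as pass@1: the probability that a sampled solution passes the problem's tests. When more than k samples are drawn per problem, the standard unbiased pass@k estimator from the HumanEval paper can be used; a short sketch (not specific to this site's methodology):

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimate given n samples per problem, c of which
    pass the unit tests (Chen et al.'s HumanEval estimator)."""
    if n - c < k:
        return 1.0  # every size-k draw must contain a passing sample
    return 1.0 - comb(n - c, k) / comb(n, k)

# With 10 samples and 3 passing, pass@1 is 3/10 (up to float rounding).
print(pass_at_k(10, 3, 1))
```

Scores are then averaged over all problems in the benchmark to give the headline number.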