Qwen/Qwen2.5-Coder-32B
TEXT GENERATION · Open Weights · Warm · 0.1K
Concurrency Cost: 2 | Model Size: 32.8B | Quant: FP8 | Ctx Length: 32k | Published: Nov 8, 2024 | License: apache-2.0 | Architecture: Transformer
Qwen/Qwen2.5-Coder-32B is a 32.5 billion parameter causal language model from the Qwen2.5-Coder series, developed by Qwen. This pre-trained base model is optimized for code generation, code reasoning, and code fixing; it builds on the Qwen2.5 architecture and was trained on 5.5 trillion tokens, much of it source code. It supports a full 131,072-token context length and is designed for real-world code-agent applications while retaining strong general and mathematical capabilities.
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model.
- temperature
- top_p
- top_k: –
- frequency_penalty: –
- presence_penalty: –
- repetition_penalty: –
- min_p: –
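The page does not show the actual values of these top configurations, but the listed knobs map directly onto the sampler fields of an OpenAI-compatible completions request, which is how Featherless exposes its models. Below is a minimal sketch of such a request in Python; the endpoint URL is an assumption, and the sampler values are illustrative defaults, not the user configs referenced above.

```python
# Sketch: querying Qwen/Qwen2.5-Coder-32B through an OpenAI-compatible
# completions endpoint with explicit sampler settings.
# ASSUMPTIONS: the API_URL below and the exact field names accepted by the
# server are not taken from this page; check your provider's API docs.
import json
import urllib.request

API_URL = "https://api.featherless.ai/v1/completions"  # assumed endpoint


def build_payload(prompt: str) -> dict:
    """Assemble a completion request with the sampler knobs listed above.

    The numeric values are illustrative, not the "Top 3" configs
    from this page (those values are not shown).
    """
    return {
        "model": "Qwen/Qwen2.5-Coder-32B",
        "prompt": prompt,
        "max_tokens": 256,
        "temperature": 0.7,
        "top_p": 0.8,
        "top_k": 20,
        "repetition_penalty": 1.05,
    }


def complete(prompt: str, api_key: str) -> str:
    """Send the request and return the generated text (network required)."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["text"]
```

Since this is the pre-trained base model rather than an instruct variant, a raw completions-style prompt (code prefix to continue) is generally a better fit than a chat-formatted conversation.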