RDson/CoderO1-DeepSeekR1-Coder-32B-Preview
TEXT GENERATION · Concurrency Cost: 2 · Model Size: 32.8B · Quant: FP8 · Ctx Length: 32k · Architecture: Transformer · Warm: 0.0K
RDson/CoderO1-DeepSeekR1-Coder-32B-Preview is a 32.8-billion-parameter language model based on the Qwen2.5 architecture, created by RDson by merging DeepSeek-R1-Distill-Qwen-32B with Qwen2.5-Coder-32B-Instruct. The model is optimized for code generation and related programming tasks; its 131,072-token native context length (served here with a 32k window) makes it suitable for working across large codebases. By combining a reasoning-distilled model with a specialized coder model, it aims to deliver stronger performance in coding scenarios than either parent alone.
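Featherless serves models through an OpenAI-compatible API, so the model can be queried with a standard client. The sketch below is a minimal example under that assumption; the base URL and API-key placeholder are assumptions, not details taken from this page.

from openai import OpenAI

# Assumed Featherless OpenAI-compatible endpoint; substitute your own API key.
client = OpenAI(
    base_url="https://api.featherless.ai/v1",
    api_key="YOUR_FEATHERLESS_API_KEY",
)

response = client.chat.completions.create(
    model="RDson/CoderO1-DeepSeekR1-Coder-32B-Preview",
    messages=[
        {"role": "user", "content": "Write a Python function that merges two sorted lists."},
    ],
)
print(response.choices[0].message.content)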
Popular Sampler Settings
The top 3 parameter combinations used by Featherless users for this model.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
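These settings map directly onto request parameters. A minimal sketch of supplying them on a completion call follows; the numeric values are illustrative placeholders (the popular values did not render above), and top_k, repetition_penalty, and min_p are not standard OpenAI fields, so they are passed through extra_body on the assumption that the backend accepts them.

from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed endpoint, as above
    api_key="YOUR_FEATHERLESS_API_KEY",
)

response = client.chat.completions.create(
    model="RDson/CoderO1-DeepSeekR1-Coder-32B-Preview",
    messages=[{"role": "user", "content": "Summarize what min_p sampling does."}],
    temperature=0.7,        # placeholder, not a confirmed popular setting
    top_p=0.9,              # placeholder
    frequency_penalty=0.0,  # placeholder
    presence_penalty=0.0,   # placeholder
    extra_body={
        "top_k": 40,                # placeholder; non-standard field
        "repetition_penalty": 1.1,  # placeholder; non-standard field
        "min_p": 0.05,              # placeholder; non-standard field
    },
)
print(response.choices[0].message.content)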