huihui-ai/DeepSeek-R1-Distill-Qwen-Coder-32B-Fusion-9010
Text generation · Concurrency cost: 2 · Model size: 32B · Quant: FP8 · Context length: 32k · Architecture: Transformer

huihui-ai/DeepSeek-R1-Distill-Qwen-Coder-32B-Fusion-9010 is a 32-billion-parameter merged model based on the Qwen2.5 architecture, created by huihui-ai. This experimental fusion combines DeepSeek-R1-Distill-Qwen-32B and Qwen2.5-Coder-32B-Instruct to strengthen programming ability and code-oriented reasoning. It targets applications that need robust code generation and understanding, drawing on the strengths of both constituent models.
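A minimal usage sketch with the Hugging Face `transformers` library is shown below. This assumes the model ships a standard chat template (as Qwen2.5-based models typically do); the `build_messages` and `generate` helpers are illustrative names, not part of any official API, and loading the 32B weights requires substantial GPU memory.

```python
"""Hedged sketch: chat-style generation with the fused model via transformers.

Assumes the model exposes a chat template compatible with
tokenizer.apply_chat_template (typical for Qwen2.5-derived models).
"""

MODEL_ID = "huihui-ai/DeepSeek-R1-Distill-Qwen-Coder-32B-Fusion-9010"


def build_messages(prompt: str) -> list[dict]:
    # Single-turn chat message list in the standard transformers format.
    return [{"role": "user", "content": prompt}]


def generate(prompt: str, max_new_tokens: int = 512) -> str:
    # Imports kept inside the function so the pure helpers above
    # can be used without transformers/torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # pick the checkpoint's native dtype
        device_map="auto",    # shard across available GPUs
    )
    inputs = tokenizer.apply_chat_template(
        build_messages(prompt),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)


# Example call (commented out: loading the 32B weights requires
# downloading tens of GB and significant GPU memory):
# print(generate("Write a Python function that reverses a linked list."))
```

For a reasoning-distilled model like this, allowing a generous `max_new_tokens` budget matters, since the model may emit chain-of-thought tokens before the final answer.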
