huihui-ai/QwQ-32B-Coder-Fusion-9010
Text generation · Model size: 32.8B · Quant: FP8 · Context length: 32k · Published: Nov 29, 2024 · License: apache-2.0 · Architecture: Transformer · Concurrency cost: 2
The huihui-ai/QwQ-32B-Coder-Fusion-9010 is a 32.8 billion parameter merged language model based on the Qwen 2.5 architecture, created by huihui-ai. It blends 90% of the weights from QwQ-32B-Preview-abliterated with 10% from Qwen2.5-Coder-32B-Instruct-abliterated. This experimental fusion aims to combine the strengths of both base models, particularly for coding tasks, while maintaining general usability, and serves to explore how the weight-blending ratio affects model performance.
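The 90/10 fusion described above can be sketched as a linear interpolation of matching weight tensors. The card does not publish the exact merge script, so the function below is an assumption: a minimal PyTorch sketch that blends two state dicts at a given ratio, demonstrated on toy tensors standing in for the two models' weights.

```python
# Hypothetical sketch of a 90/10 weight fusion; the merge method
# (simple linear interpolation of matching tensors) is an assumption,
# only the 0.9/0.1 ratio comes from the model card.
import torch

def fuse_state_dicts(primary, secondary, alpha=0.9):
    """Blend two state dicts: alpha * primary + (1 - alpha) * secondary."""
    fused = {}
    for name, weight in primary.items():
        other = secondary.get(name)
        if other is not None and other.shape == weight.shape:
            fused[name] = alpha * weight + (1.0 - alpha) * other
        else:
            # No matching tensor: keep the primary model's weights.
            fused[name] = weight.clone()
    return fused

# Toy demonstration: ones stand in for QwQ weights, zeros for Coder weights.
qwq = {"layer.weight": torch.ones(2, 2)}
coder = {"layer.weight": torch.zeros(2, 2)}
merged = fuse_state_dicts(qwq, coder, alpha=0.9)
print(merged["layer.weight"][0, 0].item())  # 0.9
```

In practice such a merge would load both full checkpoints, fuse every tensor, and save the result; mismatched or missing tensors fall back to the primary model here, which is one reasonable design choice among several.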