sm54/FuseO1-QwQ-SkyT1-Flash-32B is a 32.8-billion-parameter language model created by sm54 by merging Qwen/QwQ-32B and NovaSky-AI/Sky-T1-32B-Flash with the SCE merge method, using Qwen/Qwen2.5-32B as the base model. The merge aims to combine the strengths of its constituent models for general language tasks. With a 131,072-token context length, it is well suited to applications that require extensive contextual understanding and generation.
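A merge of this shape is typically produced with mergekit. The sketch below is illustrative only: the exact configuration sm54 used is not published here, and the `select_topk` value and `dtype` are assumptions, not the model's actual settings.

```yaml
# Hypothetical mergekit config reconstructing an SCE merge of this kind.
# Run with: mergekit-yaml config.yaml ./output-model
models:
  - model: Qwen/QwQ-32B
  - model: NovaSky-AI/Sky-T1-32B-Flash
merge_method: sce
base_model: Qwen/Qwen2.5-32B   # task vectors are computed relative to this base
parameters:
  select_topk: 1.0             # assumed: fraction of highest-variance parameters retained
dtype: bfloat16                # assumed precision for the merged weights
```

SCE computes each model's difference from the base, selects the most salient parameter deltas, and fuses them back onto the base weights, which is why a shared base model (here Qwen2.5-32B) is required.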