win10/karcher-test-32b
Type: Text generation
Model size: 32.8B
Quantization: FP8
Context length: 32k
Concurrency cost: 2
Architecture: Transformer
Published: Apr 11, 2025

win10/karcher-test-32b is a 32.8-billion-parameter language model created by win10, formed by merging four pre-trained models with the Karcher Mean method: OpenThinker2-32B, QwQ-32B-abliterated, Snowflake/Qwen-2.5-coder-Arctic-ExCoT-32B, and Qwen/Qwen2.5-Coder-32B-Instruct. The merge draws on the strengths of its constituent models, particularly the coding-focused ones, to provide a versatile foundation for generative AI tasks within a 32,768-token context length.
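The card does not show how the Karcher Mean merge is computed, so the following is only an illustrative sketch of the underlying idea: the Karcher (Fréchet) mean is the point minimizing the sum of squared geodesic distances to a set of points on a manifold. A common toy setting is the unit sphere, where normalized parameter vectors can be averaged by iterating log/exp maps; the function names and the gradient-descent scheme below are assumptions for illustration, not the model author's actual merge code.

```python
import numpy as np

def log_map(x, p):
    # Tangent vector at x pointing along the geodesic from x to p (unit sphere).
    cos_theta = np.clip(np.dot(x, p), -1.0, 1.0)
    theta = np.arccos(cos_theta)
    d = p - cos_theta * x          # component of p orthogonal to x
    norm = np.linalg.norm(d)
    if norm < 1e-12:               # p coincides with x (or is antipodal)
        return np.zeros_like(x)
    return theta * d / norm

def exp_map(x, v):
    # Move from x along tangent vector v, staying on the unit sphere.
    norm = np.linalg.norm(v)
    if norm < 1e-12:
        return x
    return np.cos(norm) * x + np.sin(norm) * v / norm

def karcher_mean(points, iters=100, tol=1e-10):
    # Fixed-point iteration: step toward the average of the log maps
    # until the mean tangent vector vanishes.
    points = [p / np.linalg.norm(p) for p in points]
    x = points[0]
    for _ in range(iters):
        v = np.mean([log_map(x, p) for p in points], axis=0)
        if np.linalg.norm(v) < tol:
            break
        x = exp_map(x, v)
    return x
```

For two points the Karcher mean is the geodesic midpoint; for example, averaging the unit vectors `[1, 0, 0]` and `[0, 1, 0]` yields `[1/√2, 1/√2, 0]`. A real weight merge would apply a scheme like this per tensor (or per flattened parameter group) across the four source checkpoints.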
