CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES
Text Generation · Model size: 32.8B · Quant: FP8 · Context length: 32k · Published: Dec 7, 2024 · License: apache-2.0 · Architecture: Transformer

CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES is a 32.8-billion-parameter language model from CombinHorizon, built on the Qwen2.5 architecture. It was created with the TIES merge method, combining the Qwen/Qwen2.5-32B base model with zetasepic/Qwen2.5-32B-Instruct-abliterated-v2. The merge aims to retain the strengths of both components and supports a context length of up to 131,072 tokens for long, complex tasks.
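A merge like this is typically produced with mergekit. The sketch below shows what a TIES merge config for the two models named above could look like; the `density`, `weight`, `normalize`, and `dtype` values are illustrative assumptions, not the settings actually used for this model.

```yaml
# Hypothetical mergekit config for a TIES merge of the two source models.
# Parameter values are assumptions for illustration only.
merge_method: ties
base_model: Qwen/Qwen2.5-32B
models:
  - model: Qwen/Qwen2.5-32B
    # The base model contributes no task vector of its own.
  - model: zetasepic/Qwen2.5-32B-Instruct-abliterated-v2
    parameters:
      density: 1.0   # fraction of task-vector parameters to keep (assumed)
      weight: 1.0    # scaling of the retained task vector (assumed)
parameters:
  normalize: true    # renormalize merged weights (assumed)
dtype: bfloat16      # output precision (assumed)
```

With mergekit installed, a config like this would be run with `mergekit-yaml config.yml ./output-dir`; TIES resolves sign conflicts between task vectors before summing them, which is why it is a common choice for combining a base model with a fine-tuned variant.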
