ertghiu256/Qwen3-4b-tcomanr-merge-v2.2
Text generation · Model size: 4B · Quant: BF16 · Context length: 32k · Architecture: Transformer · Published: Aug 23, 2025
ertghiu256/Qwen3-4b-tcomanr-merge-v2.2 is a 4-billion-parameter language model based on the Qwen3 architecture, published by ertghiu256. It is a TIES merge of multiple Qwen3 finetunes, combined specifically to strengthen code, mathematics, and general reasoning. With a 32,768-token context window, it is suited to complex problem-solving and detailed analytical applications, and it aims to provide a robust foundation for tasks requiring strong logical and computational understanding.
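The TIES merging procedure the card refers to (trim each task vector, elect a per-parameter sign, then average only the agreeing values) can be sketched on toy weight tensors. This is an illustrative NumPy sketch of the general TIES algorithm, not the actual tooling used to build this model:

```python
import numpy as np

def ties_merge(base, finetunes, density=0.5):
    """Merge several finetuned weight tensors into one, TIES-style."""
    # Task vectors: each finetune's delta from the shared base weights
    taus = [ft - base for ft in finetunes]

    trimmed = []
    for tau in taus:
        # Trim: keep only the top-`density` fraction of entries by magnitude
        k = int(np.ceil(density * tau.size))
        thresh = np.sort(np.abs(tau).ravel())[-k]
        trimmed.append(np.where(np.abs(tau) >= thresh, tau, 0.0))
    trimmed = np.stack(trimmed)

    # Elect sign: per parameter, the sign with the larger total mass wins
    sign = np.sign(trimmed.sum(axis=0))

    # Merge: average only the entries that agree with the elected sign
    mask = (np.sign(trimmed) == sign) & (trimmed != 0)
    counts = np.maximum(mask.sum(axis=0), 1)
    merged_tau = (trimmed * mask).sum(axis=0) / counts
    return base + merged_tau

# Toy example: two "finetunes" of a 4-parameter base model
base = np.zeros(4)
ft_a = np.array([1.0, -1.0, 0.1, 0.0])
ft_b = np.array([2.0, -3.0, 0.2, 0.0])
merged = ties_merge(base, [ft_a, ft_b], density=0.5)
```

Trimming discards low-magnitude (likely noisy) deltas, and the sign election resolves interference when finetunes pull the same parameter in opposite directions, which is why TIES merges tend to preserve each constituent's specialization better than a plain weight average.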