rombodawg/Rombos-Coder-V2.5-Qwen-32b
Text generation · Open weights
Concurrency cost: 2 · Model size: 32.8B · Quant: FP8 · Context length: 32k
Published: Nov 12, 2024 · License: apache-2.0 · Architecture: Transformer

Rombos-Coder-V2.5-Qwen-32b is a 32.8-billion-parameter language model developed by rombodawg on the Qwen2.5-Coder architecture. It is a continuously fine-tuned version of Qwen2.5-Coder-32B-Instruct, produced by merging the instruct model with its base model using the TIES merge method. The model is optimized for coding tasks and, according to its author, outperforms both the original instruct and base counterparts.
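TIES merges of this kind are commonly performed with the mergekit tool. The sketch below illustrates what such a configuration could look like; the exact model paths, weights, and density values are assumptions for illustration, not the author's published recipe.

```yaml
# Hypothetical mergekit config sketching a TIES merge of the
# Qwen2.5-Coder instruct model back into its base model.
# Parameter values here are illustrative assumptions.
models:
  - model: Qwen/Qwen2.5-Coder-32B-Instruct
    parameters:
      density: 1.0   # fraction of delta weights kept before sign election
      weight: 1.0    # contribution of this model to the merge
merge_method: ties
base_model: Qwen/Qwen2.5-Coder-32B
parameters:
  normalize: true    # rescale merged deltas after sign agreement
dtype: bfloat16
```

A config like this would typically be run with `mergekit-yaml config.yml ./output-dir`; TIES resolves sign conflicts between task deltas before summing them, which is what lets the merged model retain instruct behavior while staying close to the base weights.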
