cyirr/finetunecoder
Text generation
Concurrency cost: 1
Model size: 7.6B
Quant: FP8
Context length: 32k
Published: Apr 9, 2026
License: apache-2.0
Architecture: Transformer
Open weights

cyirr/finetunecoder is a 7.6-billion-parameter Qwen2 model developed by cyirr, finetuned from unsloth/deepseek-r1-distill-qwen-7b-unsloth-bnb-4bit. Training was accelerated with Unsloth and Hugging Face's TRL library, and the model supports a 32,768-token context length. It is intended for general language tasks.
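A minimal sketch of loading the model with Hugging Face `transformers`, assuming the weights are published on the Hub under the repo id `cyirr/finetunecoder` (taken from the page title); the generation prompt and parameters are illustrative only:

```python
MODEL_ID = "cyirr/finetunecoder"
MAX_CTX = 32768  # 32k context length stated on the model card


def load(model_id: str = MODEL_ID):
    """Load tokenizer and model; device_map="auto" places weights automatically."""
    # Lazy import keeps the heavy transformers dependency out of module import.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # use the dtype stored in the checkpoint where supported
        device_map="auto",
    )
    return tokenizer, model


if __name__ == "__main__":
    tok, model = load()
    prompt = "Write a Python function that reverses a string."
    inputs = tok(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=128)
    print(tok.decode(out[0], skip_special_tokens=True))
```

Because the card lists an FP8 quant, the served weights may differ from what `from_pretrained` downloads; check the repo's files for the exact checkpoint format before relying on the dtype.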
