hmuegyi/qwen2.5-en-my-opus100
Task: Text generation
Concurrency cost: 1
Model size: 7.6B
Quantization: FP8
Context length: 32k
Published: Feb 18, 2026
License: apache-2.0
Architecture: Transformer (open weights, cold)

hmuegyi/qwen2.5-en-my-opus100 is a 7.6-billion-parameter Qwen2.5 model, developed by hmuegyi and fine-tuned from unsloth/qwen2.5-7b-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library to speed up fine-tuning. The model is intended for general language tasks, leveraging the Qwen2.5 architecture and a 32,768-token context length.
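Qwen2.5 models are prompted with the ChatML format (`<|im_start|>role ... <|im_end|>` turns). The sketch below shows how such a prompt can be assembled by hand; the helper name `build_chatml_prompt` and the English-to-Burmese translation framing (suggested only by the "en-my-opus100" model id) are assumptions, not details from this model card.

```python
# Minimal sketch of ChatML prompt construction for a Qwen2.5-family model.
# The function name and the translation example are hypothetical; in practice
# the tokenizer's own chat template would normally be used instead.

def build_chatml_prompt(messages):
    """Render a list of {"role": ..., "content": ...} dicts into a ChatML
    string, ending with an open assistant turn for the model to complete."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Leave the assistant turn open so generation continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Translate to Burmese: Hello, how are you?"},
])
print(prompt)
```

With a loaded model and tokenizer, this string (or the tokenizer's built-in `apply_chat_template`) would be tokenized and passed to `generate`; within the model's 32k context window, long inputs fit in a single prompt.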
