hello7687/qwen-mina-merged-16bit
Task: Text generation
Concurrency cost: 1
Model size: 7.6B parameters
Quantization: FP8
Context length: 32k
Published: Mar 14, 2026
License: apache-2.0
Architecture: Transformer
Open weights · Cold
hello7687/qwen-mina-merged-16bit is a 7.6-billion-parameter Qwen2-based causal language model published by hello7687. It was finetuned with Unsloth and Hugging Face's TRL library, which speeds up training. The model targets general language generation tasks, building on the Qwen2 architecture and an efficient finetuning process.
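A minimal usage sketch with Hugging Face `transformers`, assuming the repository is available on the Hub and loads through the standard `AutoModelForCausalLM` path (the chat-template call assumes the tokenizer ships a Qwen2-style chat template; adjust `device_map` and dtype for your hardware):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "hello7687/qwen-mina-merged-16bit"  # repository name from this card

# Load tokenizer and weights; device_map="auto" places layers on available
# GPUs/CPU. A 7.6B model needs roughly 16 GB of memory at 16-bit precision.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

# Qwen2-based chat models expect the tokenizer's chat template.
messages = [{"role": "user", "content": "Explain what a causal language model does."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Because the weights are several gigabytes, the first call downloads and caches them; subsequent loads reuse the local cache.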