OrbitMC/qwen
Task: Text generation
Model size: 0.8B
Quantization: BF16
Context length: 32k
Concurrency cost: 1
Published: Mar 27, 2026
License: apache-2.0
Architecture: Transformer (open weights)

OrbitMC/qwen is a 0.8 billion parameter Qwen3-based causal language model developed by OrbitMC, fine-tuned from unsloth/qwen3-0.6b-unsloth-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, which enabled roughly 2x faster training, and is intended for general language tasks.
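As a sketch of how a causal language model like this is typically loaded and queried, the snippet below uses the Hugging Face Transformers library. The repo id `OrbitMC/qwen` is taken from the card title, but the use of a chat template and the generation settings are assumptions; check the repository files (tokenizer config, recommended sampling parameters) before relying on them. Note that running this downloads the model weights.

```python
# Hedged usage sketch for OrbitMC/qwen via Hugging Face Transformers.
# Assumptions: the repo ships a tokenizer with a chat template (common for
# Qwen3-based models) and BF16 weights, per the card's metadata.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OrbitMC/qwen"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

messages = [{"role": "user", "content": "Explain what a causal language model is."}]
# apply_chat_template formats the conversation the way the model was trained on.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

With only 0.8B parameters in BF16, the model should fit comfortably on CPU or a small GPU; for tighter memory budgets, a quantized load (e.g. via bitsandbytes, as the upstream `unsloth-bnb-4bit` checkpoint suggests) is an option.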
