Ba2han/qwen_augment-inst
Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Feb 3, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm
Ba2han/qwen_augment-inst is a 4-billion-parameter instruction-tuned causal language model developed by Ba2han, fine-tuned from Ba2han/qwen-augment-2511. It was trained with Unsloth and Hugging Face's TRL library, enabling roughly 2x faster training. With a context length of 40960 tokens, it is suited to applications that require a Qwen3-based architecture with efficient, rapid deployment.
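Since the model is Qwen3-based and instruction-tuned, prompts are typically rendered with the ChatML-style chat template used by the Qwen family; in practice, `tokenizer.apply_chat_template` from Hugging Face transformers handles this automatically. The sketch below shows what that rendered format looks like (the exact template details are an assumption based on the Qwen family, not taken from this card):

```python
def format_chatml(messages, add_generation_prompt=True):
    """Render {"role", "content"} messages in the ChatML-style template
    used by Qwen-family models (an assumption; the model's tokenizer
    config is the authoritative source for its chat template)."""
    rendered = "".join(
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages
    )
    if add_generation_prompt:
        # Open an assistant turn so the model continues from here.
        rendered += "<|im_start|>assistant\n"
    return rendered

prompt = format_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain instruction tuning in one sentence."},
])
print(prompt)
```

With transformers, the equivalent is `AutoTokenizer.from_pretrained("Ba2han/qwen_augment-inst")` followed by `tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)`, which applies the template bundled with the model rather than a hand-written one.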