g4me/QwenRolina3-Base-LR1e5-b32g2gc8-wsd-order-domain
Text Generation · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Mar 2, 2026 · Architecture: Transformer Gated Cold

g4me/QwenRolina3-Base-LR1e5-b32g2gc8-wsd-order-domain is a 2-billion-parameter language model fine-tuned from Qwen/Qwen3-1.7B-Base. Developed by g4me, it was trained with supervised fine-tuning (SFT) using TRL and supports a context length of 32,768 tokens. It is designed for general text-generation tasks, leveraging the underlying Qwen3 architecture for broad applicability.
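Since this is a standard causal language model checkpoint, it should load with the Hugging Face `transformers` auto classes. The sketch below is a minimal, hedged usage example assuming the checkpoint is published under the repository ID shown on this page and that `transformers` (with a PyTorch backend) is installed; the prompt and generation settings are illustrative only.

```python
# Minimal inference sketch for this checkpoint. Assumes `transformers`
# and a PyTorch backend are installed and that the repository ID below
# (taken from this model card) is reachable; not an official example.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "g4me/QwenRolina3-Base-LR1e5-b32g2gc8-wsd-order-domain"


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load the model in BF16 (matching the card's quant field) and sample a completion."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Once upon a time"))
```

Because this is a base-style SFT checkpoint rather than a chat model, plain text prompts (as above) are likely more appropriate than chat-template formatting.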
