im21/qwen3b-fft-0.6_15
Text Generation · Concurrency Cost: 1 · Model Size: 0.8B · Quant: BF16 · Ctx Length: 32k · Published: Oct 28, 2025 · Architecture: Transformer · Warm

im21/qwen3b-fft-0.6_15 is a fine-tuned version of the Qwen3-0.6B architecture, developed by im21. This 0.8-billion-parameter causal language model has a context length of 32,768 tokens and was trained with the TRL framework. It is designed for general text generation, using its fine-tuned capabilities to produce coherent, contextually relevant responses.
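As a minimal usage sketch, the model can be loaded like any other causal LM with the Hugging Face `transformers` library, assuming the checkpoint is hosted in a transformers-compatible format (the `build_prompt` helper below is illustrative only; real usage should go through the tokenizer's chat template):

```python
# Hypothetical usage sketch for im21/qwen3b-fft-0.6_15.
# Assumes `transformers` and `torch` are installed and the model
# is reachable under this hub id; nothing here is confirmed by the card
# beyond the model id and its 32k context length.
MODEL_ID = "im21/qwen3b-fft-0.6_15"
MAX_CONTEXT = 32_768  # context length stated on the model card


def build_prompt(user_message: str) -> str:
    """Illustrative single-turn prompt format (an assumption, not the
    model's actual chat template)."""
    return f"User: {user_message}\nAssistant:"


def generate(user_message: str, max_new_tokens: int = 128) -> str:
    """Load the model in bfloat16 (matching the card's BF16 quant)
    and generate a completion for a single prompt."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer(build_prompt(user_message), return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Calling `generate("Summarize what TRL is.")` would download the checkpoint on first use and return the decoded completion.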
