BredForCompanionship/qwen3-0.6b-warmup
Text Generation
Concurrency Cost: 1 · Model Size: 0.8B · Quant: BF16 · Ctx Length: 32k · Published: Mar 4, 2026 · Architecture: Transformer

BredForCompanionship/qwen3-0.6b-warmup is a 0.8-billion-parameter language model fine-tuned from Qwen/Qwen3-0.6B-Base using the TRL library, with a focus on instruction following. With a context length of 32,768 tokens, it is intended for general text generation from user prompts.
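A minimal local-inference sketch using the Hugging Face `transformers` text-generation pipeline is shown below. The `generate` helper is illustrative, not part of the model's published API, and the exact serving setup for this model may differ.

```python
# Minimal sketch of local inference with the Hugging Face `transformers`
# library; `generate` is a hypothetical helper for illustration.

MODEL_ID = "BredForCompanionship/qwen3-0.6b-warmup"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion for `prompt` with the fine-tuned model."""
    # Imported lazily so the sketch can be read/loaded without the dependency.
    from transformers import pipeline

    pipe = pipeline("text-generation", model=MODEL_ID)
    # Instruction-tuned Qwen3 models expect a chat-formatted prompt; the
    # pipeline applies the model's chat template when given a message list.
    messages = [{"role": "user", "content": prompt}]
    out = pipe(messages, max_new_tokens=max_new_tokens)
    # The pipeline returns the full message list; the last entry is the reply.
    return out[0]["generated_text"][-1]["content"]

# Example (downloads the model weights on first run):
# print(generate("Explain what a context window is in one sentence."))
```

Because the model fits comfortably in BF16 on a single consumer GPU, no quantized loading options should be needed for local use.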
