kingofjoy/qwen3_0.6b_summary_v1
Text Generation · Open Weights

- Concurrency Cost: 1
- Model Size: 0.8B
- Quantization: BF16
- Context Length: 32k
- License: apache-2.0
- Architecture: Transformer
kingofjoy/qwen3_0.6b_summary_v1 is a 0.8-billion-parameter Qwen3-based causal language model developed by kingofjoy. It was fine-tuned using Unsloth together with Hugging Face's TRL library, enabling roughly 2x faster training. With a context length of 40,960 tokens, it is optimized for efficient performance on summarization tasks.
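A minimal usage sketch with the `transformers` library is below. The chat/prompt format shown is an assumption (`build_messages` is a hypothetical helper, and the exact template should be taken from the model card); only the model ID comes from this page.

```python
MODEL_ID = "kingofjoy/qwen3_0.6b_summary_v1"  # the model described above

def build_messages(article: str) -> list[dict]:
    # Hypothetical chat format; consult the model card for the exact template.
    return [{"role": "user", "content": f"Summarize the following text:\n\n{article}"}]

def summarize(article: str, max_new_tokens: int = 128) -> str:
    # Heavy imports are kept local so build_messages stays importable without torch.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    # Render the chat messages into a single prompt string, then generate.
    prompt = tok.apply_chat_template(
        build_messages(article), tokenize=False, add_generation_prompt=True
    )
    inputs = tok(prompt, return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Decode only the newly generated tokens, skipping the prompt.
    return tok.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
```

Keeping the summarization instruction in a user message (rather than a raw completion prompt) matches how Qwen3 chat models are typically served, but verify against the model's own tokenizer configuration.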