LorenaYannnnn/general_reward-Qwen3-0.6B-baseline_cot_only-seed_0
Text generation · Concurrency cost: 1 · Model size: 0.8B · Quantization: BF16 · Context length: 32k · Published: Mar 16, 2026 · Architecture: Transformer
LorenaYannnnn/general_reward-Qwen3-0.6B-baseline_cot_only-seed_0 is a language model with roughly 0.8 billion parameters. It is based on the Qwen3 architecture and is identified as a baseline model focused on chain-of-thought (CoT) reasoning. With a context length of 32,768 tokens, it is suitable for tasks that require processing moderately long inputs. The available documentation does not provide further details on its development, training, or intended applications.
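Since the card documents only the repository id, the BF16 precision, and the 32k context window, a minimal loading sketch can be inferred. The snippet below uses the standard Hugging Face `transformers` loading pattern; the prompt template and the helper names are assumptions for illustration, not documented behavior of this checkpoint.

```python
# Hypothetical usage sketch for the model described on this card.
# MODEL_ID and MAX_CONTEXT come from the card; the prompt template
# and helper functions are illustrative assumptions.
MODEL_ID = "LorenaYannnnn/general_reward-Qwen3-0.6B-baseline_cot_only-seed_0"
MAX_CONTEXT = 32_768  # context length listed on the card


def load_model():
    """Load the checkpoint in BF16 with Hugging Face transformers.

    Requires `pip install transformers torch` and network access,
    so it is not invoked at import time.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the BF16 precision on the card
    )
    return tokenizer, model


def build_cot_prompt(question: str) -> str:
    """Assumed chain-of-thought style prompt; the card does not
    document the template actually used during training."""
    return f"{question}\nLet's think step by step."


if __name__ == "__main__":
    # Network-free demo: only build a prompt, do not download the model.
    print(build_cot_prompt("What is 17 * 24?"))
```

The actual download happens only when `load_model()` is called, keeping the sketch importable without network access.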