LorenaYannnnn/general_reward-Qwen3-0.6B-baseline_cot_only-seed_2
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 0.8B · Quant: BF16 · Ctx Length: 32k · Published: Mar 15, 2026 · Architecture: Transformer · Status: Warm
LorenaYannnnn/general_reward-Qwen3-0.6B-baseline_cot_only-seed_2 is a 0.8-billion-parameter language model based on the Qwen3 architecture. It is a baseline variant fine-tuned with a Chain-of-Thought (CoT) approach, indicating an optimization for reasoning tasks. With a context length of 32,768 tokens, it is designed for applications that require processing extensive input sequences, and its CoT training suggests stronger performance on complex problem-solving and logical deduction.
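As a minimal sketch of prompting the model for step-by-step reasoning, the snippet below assumes Featherless's OpenAI-compatible API; the base URL and the API-key environment variable name are assumptions, not confirmed by this page.

```python
# Minimal sketch: prompt the model for step-by-step (CoT-style) reasoning.
# The base URL and env var name are assumptions; verify before use.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key=os.environ["FEATHERLESS_API_KEY"],  # hypothetical env var name
)

response = client.chat.completions.create(
    model="LorenaYannnnn/general_reward-Qwen3-0.6B-baseline_cot_only-seed_2",
    messages=[
        {
            "role": "user",
            "content": (
                "A train travels 120 km in 1.5 hours. "
                "What is its average speed? Think step by step."
            ),
        }
    ],
    max_tokens=512,  # a small slice of the 32k context window
)
print(response.choices[0].message.content)
```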
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model (no values are currently listed):
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
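Since the listing does not report concrete values, the sketch below shows how such sampler settings could be passed with a request; the numbers are common illustrative defaults, not the Featherless user configs above, and routing the non-OpenAI samplers through `extra_body` assumes the backend accepts them.

```python
# Sketch: passing sampler settings with a request, reusing `client` from above.
# All numeric values are illustrative placeholders, not listed configs.
response = client.chat.completions.create(
    model="LorenaYannnnn/general_reward-Qwen3-0.6B-baseline_cot_only-seed_2",
    messages=[{"role": "user", "content": "Explain chain-of-thought prompting."}],
    temperature=0.7,        # illustrative value
    top_p=0.9,              # illustrative value
    frequency_penalty=0.0,  # illustrative value
    presence_penalty=0.0,   # illustrative value
    extra_body={
        # Samplers outside the OpenAI spec are commonly forwarded this way;
        # whether this endpoint honors them is an assumption.
        "top_k": 40,
        "repetition_penalty": 1.05,
        "min_p": 0.05,
    },
)
print(response.choices[0].message.content)
```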