bunnycore/QwQen-3B-LCoT-R1
Text Generation · Concurrency Cost: 1 · Model Size: 3.1B · Quant: BF16 · Ctx Length: 32k · Published: Feb 23, 2025 · Architecture: Transformer

bunnycore/QwQen-3B-LCoT-R1 is a 3.1 billion parameter language model based on the Qwen architecture, designed to enhance reasoning through a long chain-of-thought (LCoT) approach. It relies on a system prompt that elicits an explicit reasoning process before the final answer, making it well suited to tasks that benefit from structured, step-by-step thought. The model supports a context length of 32,768 tokens. It can exhibit repetitive output, which can be mitigated with repetition penalties and temperature adjustments.
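A minimal sketch of how such a request might be assembled for an OpenAI-compatible chat endpoint. The system-prompt wording and the parameter values are illustrative assumptions, not documented settings for this model:

```python
# Hedged sketch: building a chat-completion payload for QwQen-3B-LCoT-R1.
# The system-prompt wording and sampler values below are assumptions chosen
# to illustrate the reasoning-first usage and repetition mitigations.
import json

SYSTEM_PROMPT = (
    "Reason through the problem step by step before giving "
    "your final answer."
)  # assumption: any prompt that elicits the explicit reasoning phase

payload = {
    "model": "bunnycore/QwQen-3B-LCoT-R1",
    "messages": [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "Is 97 a prime number? Explain."},
    ],
    # Mitigations for the repetitive-output tendency noted above
    # (illustrative values, tune per task):
    "repetition_penalty": 1.1,
    "temperature": 0.7,
    "max_tokens": 1024,
}

print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the provider's chat-completions endpoint with an API key; only the dict construction is shown here.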


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model draw on the following sampler parameters:

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p