laion/exp_tas_full_thinking_traces
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Dec 31, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights
The laion/exp_tas_full_thinking_traces model is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the DCAgent/exp_tas_full_thinking_traces dataset, which suggests a specialization in processing or generating "thinking traces" for agents. The model is therefore likely suited to tasks that require detailed step-by-step reasoning or simulation of an internal thought process.
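As a minimal sketch of how such a thinking-trace model might be used, the snippet below formats a ChatML-style prompt (the format used by Qwen-family models) and splits the `<think>...</think>` reasoning block out of a completion. The helper names and the synthetic completion are illustrative assumptions, not part of this model's documented interface.

```python
# Hypothetical sketch, assuming ChatML prompting and Qwen3-style
# <think>...</think> reasoning traces; helper names are illustrative.
import re


def build_prompt(user_message: str) -> str:
    """Wrap a user turn in the ChatML format used by Qwen-family models."""
    return (
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )


def split_thinking(completion: str) -> tuple[str, str]:
    """Separate the <think>...</think> trace from the final answer text."""
    match = re.search(r"<think>(.*?)</think>", completion, re.DOTALL)
    if match is None:
        # No explicit trace: treat the whole completion as the answer.
        return "", completion.strip()
    trace = match.group(1).strip()
    answer = completion[match.end():].strip()
    return trace, answer


# Example on a synthetic completion:
completion = "<think>2 + 2 is 4.</think>The answer is 4."
trace, answer = split_thinking(completion)
print(trace)   # 2 + 2 is 4.
print(answer)  # The answer is 4.
```

In practice the prompt would be sent to the model (for example via an OpenAI-compatible completions endpoint or the `transformers` library), and `split_thinking` applied to the returned text to inspect the trace separately from the final answer.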