laion/sera-1000-opt1k__Qwen3-8B
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Mar 28, 2026 · License: other · Architecture: Transformer
laion/sera-1000-opt1k__Qwen3-8B is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the /e/data1/datasets/playground/ot/hf_hub/datasets--laion--allenai-sera-unified-1000/snapshots/f5fa11a5ed32c60ee913b2355c2bfa56a592eca0_thinking_preprocessed dataset; the "thinking"-preprocessed variant suggests a focus on reasoning-style, chain-of-thought outputs. With a 32K-token context window, it is suited to applications that require extensive contextual understanding.
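Below is a minimal usage sketch with the Hugging Face transformers library, assuming the checkpoint is available on the Hub under the same identifier and retains the standard Qwen3 chat template; the prompt and generation settings are illustrative only.

```python
# Minimal sketch: load the model and generate a chat completion.
# Assumes the checkpoint "laion/sera-1000-opt1k__Qwen3-8B" is accessible via the Hub
# and uses the default Qwen3 chat template; adjust dtype/device settings for your hardware.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "laion/sera-1000-opt1k__Qwen3-8B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the checkpoint's native precision
    device_map="auto",    # place weights on available GPU(s) or CPU
)

# Build a chat-formatted prompt (example prompt is hypothetical).
messages = [{"role": "user", "content": "Explain why the sky is blue in two sentences."}]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

# Generation length is illustrative; the model supports up to a 32K-token context.
output_ids = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```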