laion/allenai-sera-unified-31600-opt100k__Qwen3-8B
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Apr 1, 2026 · License: other · Architecture: Transformer

The laion/allenai-sera-unified-31600-opt100k__Qwen3-8B model is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was fine-tuned on the allenai-sera-unified-31600 dataset, which suggests a focus on specific research or domain-specific applications. With a context length of 32768 tokens, it can process long textual inputs, making it suited to tasks that benefit from both its specialized fine-tuning and its large context window.
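As a usage sketch, the checkpoint could be loaded with Hugging Face `transformers` under the repo id shown on this card; the hosting details are an assumption, not verified here. The `truncate_to_context` helper illustrates one simple way to respect the 32768-token window by keeping only the most recent tokens.

```python
# Hypothetical usage sketch; the repo id below is taken from this card and
# assumed to resolve on the Hugging Face Hub.
MODEL_ID = "laion/allenai-sera-unified-31600-opt100k__Qwen3-8B"
MAX_CONTEXT = 32768  # context length stated on the card


def truncate_to_context(token_ids, max_len=MAX_CONTEXT):
    """Keep only the most recent tokens that fit the context window."""
    return token_ids[-max_len:] if len(token_ids) > max_len else list(token_ids)


if __name__ == "__main__":
    # Heavy: downloads ~8B parameters. Requires `pip install transformers torch`.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    prompt = "Summarize the key idea of instruction tuning."
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    output_ids = model.generate(input_ids, max_new_tokens=128)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The generation call is guarded behind `__main__` because loading an 8B model is expensive; in a serving deployment the same model id would typically be passed to an OpenAI-compatible endpoint instead.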
