laion/allenai-sera-unified-316__Qwen3-8B
Text Generation · Model Size: 8B · Quant: FP8 · Context Length: 32k · Published: Mar 25, 2026 · License: other · Architecture: Transformer · Concurrency Cost: 1

laion/allenai-sera-unified-316__Qwen3-8B is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B on the laion/allenai-sera-unified-316 dataset, suggesting a focus on unified or specialized data-processing tasks. With a 32,768-token context window, it is suited to applications that require extensive contextual understanding, where both its specialized fine-tuning and its large context window can be exploited.
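As a rough usage sketch, the snippet below assembles a chat-completions request payload for an OpenAI-compatible inference endpoint serving this model. The request shape and the `max_tokens` guard are assumptions for illustration; consult the hosting provider's API documentation for the actual endpoint and parameters.

```python
import json

MODEL_ID = "laion/allenai-sera-unified-316__Qwen3-8B"
CTX_LENGTH = 32768  # context window stated on this page

def build_request(prompt: str, max_tokens: int = 512) -> dict:
    """Assemble an OpenAI-style chat-completions payload (hypothetical shape).

    Rejects generation budgets that could not fit in the model's
    32k context even with an empty prompt.
    """
    if max_tokens > CTX_LENGTH:
        raise ValueError(
            f"max_tokens={max_tokens} exceeds the {CTX_LENGTH}-token context window"
        )
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

payload = build_request("Summarize this document in three sentences.")
print(json.dumps(payload, indent=2))
```

Sending the payload (e.g. with `requests.post`) is left out, since the endpoint URL depends on where the model is deployed.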
