laion/r2egym-unified-3160__Qwen3-8B
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Mar 25, 2026 · License: other · Architecture: Transformer

laion/r2egym-unified-3160__Qwen3-8B is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the laion/r2egym-unified-3160 dataset, which suggests a specialization in the dataset's subject matter. With a 32K-token context length, it is suited to tasks that require extensive contextual understanding.