laion/r2egym-unified-1000__Qwen3-8B
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Mar 25, 2026 · License: other · Architecture: Transformer
laion/r2egym-unified-1000__Qwen3-8B is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the r2egym-unified-1000 dataset, which suggests it is optimized for tasks in that dataset's domain. With a context length of 32,768 tokens, it is suited to applications that require processing long input sequences.
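As a fine-tune of Qwen/Qwen3-8B, the checkpoint should load with the standard Hugging Face `transformers` causal-LM API. The sketch below is illustrative, not taken from the model card: the prompt is invented, and running it requires the `transformers` package plus a GPU able to hold an 8B checkpoint.

```python
# Minimal usage sketch for laion/r2egym-unified-1000__Qwen3-8B (hypothetical
# example; the model card does not ship official inference code).

MODEL_ID = "laion/r2egym-unified-1000__Qwen3-8B"
MAX_CONTEXT = 32_768  # context length stated on the model card


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    # Import deferred so the constants above are usable without transformers
    # installed; actually loading the weights needs substantial GPU memory.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Explain what a unit test is in one sentence."))
```

Because Qwen3 models are chat-tuned, production use would typically wrap the prompt with `tokenizer.apply_chat_template` rather than passing raw text.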