laion/r2egym-1000-opt1k__Qwen3-8B
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Mar 27, 2026 · License: other · Architecture: Transformer

The laion/r2egym-1000-opt1k__Qwen3-8B model is an 8-billion-parameter language model based on the Qwen3 architecture, fine-tuned and published by LAION. It was trained on the r2egym-unified-1000 dataset, which suggests an optimization for agentic software-engineering tasks of the kind posed by R2E-Gym environments, rather than general-purpose chat. With a 32,768-token context window, the model is likely tailored to specialized applications that require deep contextual understanding within its fine-tuned domain.
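As a minimal usage sketch, the snippet below loads the checkpoint with the Hugging Face transformers library, assuming the model is published on the Hub under this ID as a standard Qwen3-style causal LM with a chat template (neither of which this page confirms):

```python
# Minimal sketch, assuming the checkpoint is a standard Qwen3 causal LM
# available on the Hugging Face Hub under this repo ID (an assumption).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "laion/r2egym-1000-opt1k__Qwen3-8B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # spread layers across available GPUs/CPU
)

# Qwen3-style models expect chat-formatted input; apply_chat_template
# builds the prompt tensor from a list of role/content messages.
messages = [{"role": "user", "content": "Summarize what a unified diff is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

For the full 32k-token context, serving through an inference engine with paged attention (e.g., vLLM) is usually more memory-efficient than plain transformers generation.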
