laion/Kimi-2-5-r2egym_sandboxes-maxeps-32k__Qwen3-8B
Task: Text Generation
Concurrency Cost: 1
Model Size: 8B
Quant: FP8
Ctx Length: 32k
Published: Mar 11, 2026
License: other
Architecture: Transformer

laion/Kimi-2-5-r2egym_sandboxes-maxeps-32k__Qwen3-8B is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the Kimi-2.5-r2egym_sandboxes-maxeps-32k dataset, whose name suggests a specialization in reinforcement-learning or sandboxed simulation environments. With its 32k-token context length, the model is aimed at understanding and generation tasks within such simulated or game-like contexts.
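
As a quick usage sketch, the snippet below loads the checkpoint with Hugging Face transformers, assuming it is published on the Hub under the model id shown above. Note that the FP8 quantization in the metadata describes the hosted endpoint; a local load will typically use the checkpoint's native weights (e.g. bf16), and the prompt content is purely illustrative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub id, taken from the model name above.
model_id = "laion/Kimi-2-5-r2egym_sandboxes-maxeps-32k__Qwen3-8B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # use the dtype stored in the checkpoint
    device_map="auto",   # spread layers across available devices
)

# Qwen3-derived checkpoints ship a chat template; apply it for prompting.
messages = [{"role": "user", "content": "Explain what a sandboxed coding environment is."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# The 32k context budget covers prompt and generated tokens combined.
output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```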
