laion/GLM-4_7-r2egym_sandboxes-maxeps-131k
Text Generation
Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 8, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
laion/GLM-4_7-r2egym_sandboxes-maxeps-131k is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B on the DCAgent2/GLM-4.7-r2egym_sandboxes-maxeps-131k dataset. The fine-tuning targets tasks in the r2egym sandboxes environment, which suggests a specialization in reinforcement learning or agent-based interaction within simulated environments.
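The sketch below shows one way to query the model for text generation. It assumes the checkpoint exposes standard Hugging Face weights and inherits the Qwen3 chat template from its base model; neither is confirmed by the listing, so treat it as a starting point rather than the official usage.

```python
# Minimal inference sketch (assumption: standard Hugging Face weights
# and the Qwen3 chat template inherited from the base model).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "laion/GLM-4_7-r2egym_sandboxes-maxeps-131k"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Hypothetical prompt illustrating the agent/sandbox use case.
messages = [
    {"role": "user", "content": "List the files in the current sandbox directory."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```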