laion/exp-uns-r2egym-33_6x_glm_4_7_traces_jupiter
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 20, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

laion/exp-uns-r2egym-33_6x_glm_4_7_traces_jupiter is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. According to its metadata, it was trained on the dataset at /data/cat/ws/befe330h-befe330h-otagent/huggingface/hub/datasets--DCAgent--exp-uns-r2egym-33_6x_glm_4.7_traces_jupiter/snapshots/9f6fd69f6fa50425609d375c4f7198b192f4a61b_thinking_preprocessed. As a specialized fine-tune, its primary differentiator is this training data, so it is likely optimized for tasks related to that dataset's content.
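Since the card lists serving parameters (FP8 quantization, a 32k context window, concurrency cost), a minimal sketch of querying the model through an OpenAI-compatible chat-completions endpoint may be useful. This is an assumption, not documented here: the endpoint URL, supported parameters, and the prompt are all hypothetical; only the model identifier and the 32k context limit come from the card.

```python
import json

# Model identifier from the card above.
MODEL_ID = "laion/exp-uns-r2egym-33_6x_glm_4_7_traces_jupiter"

# Context window from the card (32k tokens); prompt plus completion must fit.
CTX_LENGTH = 32_000

# Hypothetical chat-completions request body (OpenAI-compatible schema is an
# assumption; check the hosting provider's API docs before use).
payload = {
    "model": MODEL_ID,
    "messages": [
        {"role": "user", "content": "Summarize what this model was trained for."},
    ],
    "max_tokens": 512,  # leaves ample room inside the 32k context window
    "temperature": 0.7,
}

# In practice this dict would be POSTed as JSON to the provider's
# /v1/chat/completions endpoint with an Authorization header.
print(json.dumps(payload, indent=2))
```

The request body is plain JSON, so any HTTP client (or the `openai` Python package pointed at the provider's base URL) can submit it unchanged.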
