laion/exp-syh-r2egym-swesmith-mixed_glm_4_7_traces_jupiter
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 21, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
The laion/exp-syh-r2egym-swesmith-mixed_glm_4_7_traces_jupiter model is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the /data/cat/ws/befe330h-befe330h-otagent/huggingface/hub/datasets--DCAgent--exp-syh-r2egym-swesmith-mixed_glm_4.7_traces_jupiter/snapshots/97638d480d61a3575e634d808606a58bfc6a0f9e_thinking_preprocessed dataset. As a specialized fine-tune, its primary differentiator is this training data and the associated hyperparameters, which suggest a focus on tasks matching the nature of that dataset.
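Since the model ships as open weights, it can presumably be loaded with the standard Hugging Face transformers API, like its Qwen3-8B base. The sketch below assumes the usual AutoModel interface and is not confirmed by this card; the `load_model` helper name is illustrative, and actually running it downloads roughly 8B parameters of weights.

```python
from typing import Any, Tuple

# Model ID and context length as stated on this card.
MODEL_ID = "laion/exp-syh-r2egym-swesmith-mixed_glm_4_7_traces_jupiter"
MAX_CONTEXT = 32_768  # 32k context window

def load_model() -> Tuple[Any, Any]:
    """Load tokenizer and model via the standard transformers API (assumed).

    Requires `pip install transformers accelerate` and enough GPU/CPU
    memory for an 8B-parameter model; triggers a large weight download.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # respect the checkpoint's dtype (card lists FP8 quant)
        device_map="auto",    # spread layers across available devices
    )
    return tokenizer, model
```

Inference would then follow the usual `tokenizer(...)` / `model.generate(...)` pattern used for the Qwen3 family.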