laion/exp-syh-r2egym-swesmith-mixed_glm_4_7_traces_jupiter_cleaned
Task: Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 27, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

laion/exp-syh-r2egym-swesmith-mixed_glm_4_7_traces_jupiter_cleaned is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the dataset at /data/cat/ws/befe330h-befe330h-otagent/huggingface/hub/datasets--DCAgent--exp-syh-r2egym-swesmith-mixed_glm_4.7_traces_jupiter_cleaned/snapshots/6bda9bf636a815d9ffd0a001e1a602b93c883472_thinking_preprocessed, a local, thinking-preprocessed snapshot of the DCAgent exp-syh-r2egym-swesmith-mixed_glm_4.7_traces_jupiter_cleaned dataset. The model targets general language understanding and generation tasks, combining the Qwen3-8B base with this fine-tuning data.
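As a usage sketch, the checkpoint can presumably be loaded like any Hugging Face causal LM via the transformers library. The repo id below is taken from the page title; the prompt and generation settings are illustrative assumptions, not published defaults.

```python
# Hypothetical usage sketch for this checkpoint. Assumes the repo id from the
# page title resolves on the Hugging Face Hub; nothing here is an official
# recipe from the model authors.
MODEL_ID = "laion/exp-syh-r2egym-swesmith-mixed_glm_4_7_traces_jupiter_cleaned"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model and generate a completion for `prompt`."""
    # Imports kept inside the function so the module can be inspected
    # without pulling in (or downloading) the heavy dependencies.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # respect the checkpoint's stored precision
        device_map="auto",    # place weights on GPU if available
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)


# generate("Explain what a context window is.")  # uncomment to run; downloads ~16 GB of weights
```

Since the card advertises a 32k context window, long prompts should fit, but actual memory use depends on the serving precision (FP8 per the listing above).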
