laion/exp-syh-r2egym-swesmith-mixed_glm_4_7_traces_locetash
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Feb 9, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

laion/exp-syh-r2egym-swesmith-mixed_glm_4_7_traces_locetash is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B on the DCAgent/exp-syh-r2egym-swesmith-mixed_glm_4.7_traces_locetash dataset, with a context length of 32,768 tokens. It retains the Qwen3-8B architecture and is intended for tasks aligned with its fine-tuning data.
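As a minimal sketch of how such a model is typically queried, the snippet below builds a request payload for an OpenAI-compatible chat-completions endpoint. The endpoint URL, the `/v1/chat/completions` route, and the helper names are assumptions for illustration; adapt them to wherever the model is actually served.

```python
# Sketch: querying the model via an OpenAI-compatible API.
# The base URL and route are assumptions, not taken from the model card.
import json
import urllib.request

MODEL_ID = "laion/exp-syh-r2egym-swesmith-mixed_glm_4_7_traces_locetash"
CTX_LENGTH = 32768  # context window stated on the model card


def build_chat_request(messages, max_tokens=512):
    """Build the JSON payload for a chat-completions call."""
    return {
        "model": MODEL_ID,
        "messages": messages,
        "max_tokens": max_tokens,
    }


def query(base_url, payload):
    """POST the payload to an assumed /v1/chat/completions route."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    payload = build_chat_request(
        [{"role": "user", "content": "Summarize this repository."}]
    )
    # result = query("http://localhost:8000", payload)  # assumes a local server
    print(json.dumps(payload, indent=2))
```

The network call is left commented out so the sketch runs without a live server; uncomment `query(...)` once an endpoint serving this model is available.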
