laion/exp-psu-swesmith-1K_glm_4-7_traces_jupiter__Qwen3-8B
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Mar 11, 2026 · License: other · Architecture: Transformer

The laion/exp-psu-swesmith-1K_glm_4-7_traces_jupiter__Qwen3-8B model is an 8-billion-parameter language model fine-tuned from Qwen3-8B. It was trained on the /e/data1/datasets/playground/ot/hf_hub/datasets--DCAgent--exp-psu-swesmith-1K_glm_4.7_traces_jupiter/snapshots/24c8342833108c3a15a23b64f37b83ff7e65efa4_thinking_preprocessed dataset. It is a specialized fine-tune: its primary differentiation comes from this specific training data and process rather than from broad general-purpose capabilities.
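Since the model is published under the repository id above, it can presumably be loaded with the standard Hugging Face `transformers` API, like any other Qwen3-8B fine-tune. The sketch below is illustrative, not an official usage snippet from the model card; the generation parameters are assumptions, and the heavy model download only happens when `generate` is actually called.

```python
# Hypothetical usage sketch for this fine-tune via Hugging Face transformers.
# The repo id comes from the model card; everything else is an assumption.
MODEL_ID = "laion/exp-psu-swesmith-1K_glm_4-7_traces_jupiter__Qwen3-8B"


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    # Imported lazily so this module can be inspected without
    # transformers/torch installed or the 8B weights downloaded.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # pick up the FP8/bf16 weights as stored
        device_map="auto",    # place layers on available GPUs/CPU
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Write a Python function that reverses a string."))
```

Given the 32k context length, long coding traces should fit in a single prompt, but anything beyond that window would need truncation or chunking on the caller's side.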
