laion/syh-r2eg-askl-glm_4-7_trac_jupi_-gfi-swes-rand-filt-10K_glm_4-7_trac_jupi_32B
TEXT GENERATION
Concurrency Cost: 2
Model Size: 32B
Quant: FP8
Ctx Length: 32k
Published: Mar 4, 2026
License: other
Architecture: Transformer
laion/syh-r2eg-askl-glm_4-7_trac_jupi_-gfi-swes-rand-filt-10K_glm_4-7_trac_jupi_32B is a 32-billion-parameter language model fine-tuned from Qwen/Qwen3-32B. It was trained on two datasets: 'exp-syh-r2egym-askllm-constrained_glm_4.7_traces_jupiter_cleaned' and 'exp-gfi-swesmith-random-filtered-10K_glm_4.7_traces_jupiter'. Given this training data, the model is likely specialized for the tasks those traces cover, potentially constrained language generation or trace analysis, and it supports a context length of 32768 tokens.
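For reference, below is a minimal sketch of running inference against this checkpoint with the Hugging Face transformers library. It assumes the repo id shown on this card is a published checkpoint you can download, that your hardware can host a 32B model, and that the prompt and generation parameters are purely illustrative.

```python
# Minimal inference sketch (assumptions: the repo id below is downloadable,
# and enough GPU memory is available for a 32B checkpoint).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "laion/syh-r2eg-askl-glm_4-7_trac_jupi_-gfi-swes-rand-filt-10K_glm_4-7_trac_jupi_32B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # shard across available GPUs (requires accelerate)
)

# Qwen3 derivatives typically ship a chat template, so format the prompt
# through it rather than passing raw text. The question is a placeholder.
messages = [{"role": "user", "content": "Summarize what a stack trace tells you."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```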