laion/glm46-swesmith-maxeps-131k
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Dec 15, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

laion/glm46-swesmith-maxeps-131k is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the DCAgent2/glm46-neulab-mind2web-sandboxes-maxeps-131k dataset, which suggests a specialization in agentic tasks and interactions within sandboxed environments. Its strengths are therefore likely tied to the domains represented in that fine-tuning data rather than to general-purpose chat.
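Since the base model is Qwen/Qwen3-8B, prompts for this model most likely follow the ChatML-style template used by the Qwen family (in practice you would use the tokenizer's `apply_chat_template` from `transformers`). As a minimal sketch, assuming the standard ChatML special tokens apply here, a prompt for an agentic task could be assembled like this:

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-style prompt (assumed format, inherited from Qwen3-8B).

    The <|im_start|>/<|im_end|> tokens delimit each turn; the trailing
    assistant header leaves the model positioned to generate its reply.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )


# Hypothetical agentic instruction, for illustration only.
prompt = build_chatml_prompt(
    system="You are a web agent operating in a sandboxed browser.",
    user="Find the checkout button on the current page and click it.",
)
print(prompt)
```

For real inference, prefer the tokenizer's own chat template over hand-built strings, since the hosted checkpoint may override the default Qwen template.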
