laion/glm46-swesmith-maxeps-131k-fixthink
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Feb 16, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

The laion/glm46-swesmith-maxeps-131k-fixthink model is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the thinking-preprocessed variant of the penfever/glm46-swesmith-maxeps-131k dataset (snapshot 4d4c2d4a9d21f73870ed31c7bc6028035b3b6ca7). The model offers a 32,768-token context window and is intended for tasks that benefit from this specialized fine-tuning.
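Since the card gives no usage snippet, a minimal sketch of using the model with the Hugging Face `transformers` library might look like the following. The model id and 32,768-token context length come from the card; the loading and generation calls, the example prompt, and the `fits_context` helper are assumptions about typical usage, not documented behavior.

```python
# Sketch only: assumes the model is loadable via the standard transformers API.
MODEL_ID = "laion/glm46-swesmith-maxeps-131k-fixthink"
CTX_LEN = 32_768  # context window stated on the card


def fits_context(n_prompt_tokens: int, max_new_tokens: int, ctx_len: int = CTX_LEN) -> bool:
    """Check that prompt tokens plus requested generation fit in the context window."""
    return n_prompt_tokens + max_new_tokens <= ctx_len


def main() -> None:
    # Heavy, network-bound part; requires `pip install transformers torch`
    # and downloads the 8B checkpoint on first use.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )

    prompt = "Explain what this function does:\n\ndef f(xs):\n    return sorted(set(xs))"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

    # Guard against overrunning the 32k context window.
    assert fits_context(inputs["input_ids"].shape[1], max_new_tokens=512)

    out = model.generate(**inputs, max_new_tokens=512)
    print(tokenizer.decode(out[0], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

The `fits_context` guard is a simple budget check: because generated tokens share the same window as the prompt, the prompt length plus `max_new_tokens` must stay at or below 32,768.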
