laion/GLM-4_7-inferredbugs-sandboxes-maxeps-131k
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 8, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

laion/GLM-4_7-inferredbugs-sandboxes-maxeps-131k is an 8-billion-parameter fine-tune of Qwen/Qwen3-8B, developed by laion. It was adapted using the DCAgent2/GLM-4.7-inferredbugs-sandboxes-maxeps-131k dataset and supports a context length of 32,768 tokens. Its primary differentiator is its targeted fine-tuning on inferred-bug sandbox data, which makes it suited to specialized applications in that domain.
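When building prompts for this model, the 32,768-token context window is the hard limit on prompt plus generation length. The sketch below shows one common way to budget for that limit by keeping only the most recent tokens; the whitespace-based token list and the `reserve_for_output` parameter are illustrative assumptions, not part of the model card, and a real application would count tokens with the model's actual tokenizer.

```python
# Illustrative sketch: trimming an over-long prompt so that prompt +
# generated output fits the model's 32,768-token context window.
# The token list here is a stand-in; use the model's real tokenizer
# (e.g. via the transformers library) to count tokens in practice.

MAX_CONTEXT_TOKENS = 32_768  # context length stated on this model card


def fit_to_context(tokens, reserve_for_output=1_024):
    """Keep the most recent tokens so the prompt leaves room for generation.

    reserve_for_output is a hypothetical budget for the model's reply.
    """
    budget = MAX_CONTEXT_TOKENS - reserve_for_output
    return tokens[-budget:] if len(tokens) > budget else tokens


# A 40,000-token prompt gets trimmed to 32,768 - 1,024 = 31,744 tokens.
prompt_tokens = ["tok"] * 40_000
trimmed = fit_to_context(prompt_tokens)
print(len(trimmed))  # 31744
```

Keeping the tail of the prompt (rather than the head) is a common choice for chat-style use, where the most recent turns matter most; other applications may prefer to truncate from the middle or summarize older content instead.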
