laion/glm46-defects4j-32ep-131k
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quantization: FP8 · Context Length: 32k · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
The laion/glm46-defects4j-32ep-131k model is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the penfever/glm46-defects4j-32ep-131k dataset, whose name suggests a specialization in software-defect tasks (Defects4J is a benchmark of real Java bugs) and code analysis. With a context length of 32,768 tokens, it can process substantial inputs, such as entire source files, for code-related applications.
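As a sketch of how such a model might be used, the snippet below loads it with the Hugging Face `transformers` library and prompts it to repair a buggy method. The prompt wording, generation settings, and the `build_repair_prompt` helper are illustrative assumptions, not part of the model card; running the generation step requires hardware with enough memory for an 8B model.

```python
# Hypothetical usage sketch for laion/glm46-defects4j-32ep-131k via
# Hugging Face transformers. The prompt format below is an assumption,
# not a documented interface of this model.

MODEL_ID = "laion/glm46-defects4j-32ep-131k"
MAX_CONTEXT = 32768  # context length stated on the model card


def build_repair_prompt(buggy_code: str) -> str:
    """Wrap a buggy code snippet in an illustrative repair instruction."""
    return (
        "The following method contains a bug. "
        "Return a corrected version of the method.\n\n" + buggy_code
    )


def generate_fix(buggy_code: str, max_new_tokens: int = 512) -> str:
    """Load the model and generate a candidate fix (heavy: downloads ~8B weights)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_repair_prompt(buggy_code), return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(generate_fix("public int add(int a, int b) { return a - b; }"))
```

Keeping the `transformers` import inside `generate_fix` lets the prompt-building logic be used or tested without the library installed.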