laion/glm46-code-feedback-maxeps-131k
Task: Text Generation · Model Size: 8B · Quant: FP8 · Context Length: 32k · Concurrency Cost: 1 · License: apache-2.0 · Architecture: Transformer · Open Weights

laion/glm46-code-feedback-maxeps-131k is an 8 billion parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the penfever/glm46-code-feedback-maxeps-131k dataset, specializing it in understanding code and generating code-related feedback. With its 32768-token context length, it can process substantial code files or multi-file excerpts in a single prompt.
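As a sketch of how such a model might be queried for code feedback, the example below uses the Hugging Face transformers chat API. This is an assumption about the serving setup, not the model card's documented usage; the prompt wording and generation parameters are illustrative.

```python
# Sketch: asking the model for feedback on a code snippet via the
# Hugging Face transformers chat API (assumed usage, not from the card).

MODEL_ID = "laion/glm46-code-feedback-maxeps-131k"

def build_prompt(code: str) -> list[dict]:
    # Chat-style messages asking the model to review a snippet.
    return [
        {"role": "user",
         "content": f"Please review this code and give feedback:\n\n{code}"}
    ]

def get_feedback(code: str, max_new_tokens: int = 512) -> str:
    # Heavy dependency imported lazily so the prompt helper above
    # stays usable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer.apply_chat_template(
        build_prompt(code), add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(
        outputs[0][inputs.shape[-1]:], skip_special_tokens=True
    )

# Example (downloads the ~8B checkpoint on first call):
# print(get_feedback("def add(a, b): return a - b"))
```

The 32k context budget leaves room for long snippets plus surrounding instructions, so whole files can usually be reviewed in one call rather than chunked.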
