penfever/glm46-ling-coder-sft-sandboxes-1-maxeps-131k
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Dec 19, 2025 · Architecture: Transformer

penfever/glm46-ling-coder-sft-sandboxes-1-maxeps-131k is an 8-billion-parameter language model, trained from scratch by penfever, with a context length of 32,768 tokens. Its distinguishing details are its training hyperparameters, notably a peak learning rate of 4e-05 paired with a cosine learning-rate scheduler, so its capabilities are best characterized by this foundational training setup.
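To make the stated schedule concrete, the sketch below shows how a cosine learning-rate decay from the listed peak of 4e-05 behaves. The total step count (10,000) is a hypothetical illustration, not from the model card, and this simple form assumes no warmup and decay to zero:

```python
import math

def cosine_lr(step, total_steps, peak_lr=4e-5):
    """Cosine learning-rate decay from peak_lr down to 0 (no warmup assumed)."""
    progress = step / total_steps
    return peak_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

# Over a hypothetical 10,000-step run:
print(cosine_lr(0, 10_000))      # peak: 4e-05
print(cosine_lr(5_000, 10_000))  # halfway: ~2e-05
print(cosine_lr(10_000, 10_000)) # end: ~0.0
```

Actual trainers often add a linear warmup phase and a non-zero floor, so the real schedule may differ in those details.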
