laion/GLM-4_7-stackexchange-tezos-sandboxes-maxeps-131k
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Feb 8, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

The laion/GLM-4_7-stackexchange-tezos-sandboxes-maxeps-131k model is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the DCAgent2/GLM-4.7-stackexchange-tezos-sandboxes-maxeps-131k dataset, indicating a specialization in content related to Tezos sandboxes and StackExchange discussions. The model targets understanding and generation tasks within these technical domains, and its 32,768-token context length supports long, detailed inputs.
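As a minimal usage sketch, the checkpoint could be loaded with the Hugging Face Transformers library, assuming it is published on the Hub under the model ID above in standard Transformers format (the `generate` helper below is illustrative, not part of the model card):

```python
# Hypothetical sketch: loading the checkpoint via Hugging Face Transformers.
# MODEL_ID and CONTEXT_LENGTH are taken from the card; everything else is an assumption.
MODEL_ID = "laion/GLM-4_7-stackexchange-tezos-sandboxes-maxeps-131k"
CONTEXT_LENGTH = 32768  # tokens, per the model card

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Download the ~8B-parameter weights and produce a completion for `prompt`."""
    # Imported inside the function so the sketch stays importable without
    # transformers installed or network access.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # keep the published (FP8) precision where supported
        device_map="auto",    # place layers on available GPUs/CPU automatically
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

Calling `generate("How do I start a Tezos sandbox node?")` would then return a domain-specific completion, provided the weights are available locally or downloadable.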
