laion/exp-syh-tezos-stackoverflow-mixed_glm_4_7_traces_jupiter_cleaned
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 27, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

laion/exp-syh-tezos-stackoverflow-mixed_glm_4_7_traces_jupiter_cleaned is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was adapted on a mixed dataset derived from Tezos and Stack Overflow traces, which suggests it is optimized for blockchain development content and technical Q&A. The fine-tuning on this specialized data aims to improve performance in domain-specific contexts, particularly technical information retrieval and generation.
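A minimal usage sketch, assuming the checkpoint loads through the standard Hugging Face `transformers` Auto classes (not verified against this repository); the `build_prompt` helper and its instruction format are illustrative, not part of the model card:

```python
# Hypothetical sketch for loading and prompting the model.
# MODEL_ID is taken from the card; the loading code assumes the
# standard transformers API and requires network access plus a GPU
# large enough for an 8B FP8 checkpoint.
MODEL_ID = "laion/exp-syh-tezos-stackoverflow-mixed_glm_4_7_traces_jupiter_cleaned"


def load_model(model_id: str = MODEL_ID):
    """Load tokenizer and weights (heavy: downloads the checkpoint)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # respect the stored (FP8) dtype where supported
        device_map="auto",    # place layers on available accelerators
    )
    return tokenizer, model


def build_prompt(question: str) -> str:
    """Illustrative instruction-style prompt for technical Q&A."""
    return f"Question:\n{question}\n\nAnswer:"


prompt = build_prompt("How do I originate a smart contract on Tezos?")
```

The prompt helper keeps the example runnable without downloading weights; in practice one would pass `prompt` through the tokenizer and `model.generate`, staying within the 32k context window.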
