laion/exp-uns-tezos-128unique_glm_4_7_traces_jupiter_cleaned
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 27, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

The laion/exp-uns-tezos-128unique_glm_4_7_traces_jupiter_cleaned model is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on a dataset of Tezos traces, suggesting a specialization in processing or generating blockchain data or similarly structured information. With a context length of 32,768 tokens, it is suited to tasks that require extensive contextual understanding within its specialized domain.
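As a rough usage sketch, the checkpoint could be loaded with the Hugging Face `transformers` library. This is an assumption based on the Qwen3-8B base (a standard causal-LM architecture), not an official quickstart; the repo id and the example prompt below are illustrative, and `load_model` downloads roughly 8B parameters of weights when called.

```python
MODEL_ID = "laion/exp-uns-tezos-128unique_glm_4_7_traces_jupiter_cleaned"
CTX_LEN = 32768  # context window stated on the model card


def fits_context(n_prompt_tokens: int, max_new_tokens: int,
                 ctx_len: int = CTX_LEN) -> bool:
    """Check that the prompt plus the generation budget fits the 32k window."""
    return n_prompt_tokens + max_new_tokens <= ctx_len


def load_model():
    """Load tokenizer and model. Calling this downloads the 8B weights."""
    # Imports kept local so the module imports without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    return tokenizer, model


def generate(prompt: str, max_new_tokens: int = 512) -> str:
    tokenizer, model = load_model()
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    # Guard against exceeding the 32768-token context length.
    assert fits_context(inputs["input_ids"].shape[1], max_new_tokens)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

A hypothetical call might look like `generate("Summarize the following Tezos transaction trace: ...")`; prompts whose token count plus generation budget exceeds 32,768 should be truncated or chunked first.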
