laion/glm-4_6-nemo-prism
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · License: apache-2.0 · Architecture: Transformer · Open weights · Cold
laion/glm-4_6-nemo-prism is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was adapted using the penfever/glm-4.6-nemo-prism dataset, suggesting a specialization derived from that training data. With a context length of 32,768 tokens, it is suited to tasks that benefit from extensive contextual understanding.
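Since the weights are open, the model can in principle be loaded locally with the Hugging Face `transformers` library. The sketch below is an assumption, not part of the listing: it assumes the repository id `laion/glm-4_6-nemo-prism` is available on the Hub and that the standard `AutoModelForCausalLM` path applies, as it does for the Qwen3 base model.

```python
# Hedged sketch: local text generation with transformers, assuming the
# laion/glm-4_6-nemo-prism checkpoint follows the standard Qwen3-style layout.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "laion/glm-4_6-nemo-prism"  # repository id from the listing
MAX_CONTEXT = 32768  # 32k-token context window stated on the card


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion for `prompt`; downloads weights on first call."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Summarize the benefits of long-context language models."))
```

Note that an 8B model, even at FP8, requires a GPU with roughly 10 GB of free memory; `device_map="auto"` lets `accelerate` spill layers to CPU when that is not available.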