laion/glm46-qasper-maxeps-131k
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
The laion/glm46-qasper-maxeps-131k model is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was trained on the penfever/glm46-qasper-maxeps-131k dataset, which suggests optimization for question answering over scientific papers (QASPER). The model is likely intended for information extraction and comprehension in academic or technical documents, using its 32,768-token context length to process longer texts.
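Below is a minimal usage sketch, assuming the checkpoint is hosted on the Hugging Face Hub under the same identifier and loads with the standard transformers causal-LM classes, as its Qwen3-8B base does. The prompt layout (paper excerpt followed by a question) is an assumption illustrating a QASPER-style query, not a documented format.

```python
# Minimal sketch, assuming standard transformers loading; prompt format is an assumption.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "laion/glm46-qasper-maxeps-131k"  # assumed Hub identifier

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# QASPER-style input: a scientific-paper excerpt plus a question about it.
paper_excerpt = "..."  # up to ~32k tokens of paper text
question = "What evaluation metrics does the paper report?"

messages = [
    {"role": "user", "content": f"{paper_excerpt}\n\nQuestion: {question}"}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens (the answer).
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```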