theprint/Coma-7B
Task: Text generation
Concurrency cost: 1
Model size: 7.6B
Quantization: FP8
Context length: 32k
Published: Oct 7, 2025
License: apache-2.0
Architecture: Transformer
Tags: Open Weights, Cold

Coma-7B is a 7.6-billion-parameter language model developed by theprint, built on the Qwen 2.5 7B architecture. It has been fine-tuned with GRPO (Group Relative Policy Optimization) on Meta's NaturalReasoning dataset, optimizing it for tasks that require logical inference. This makes the model well suited to applications that demand robust natural-reasoning capabilities.
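Since the weights are open, the model can plausibly be loaded with Hugging Face `transformers` under the repo id `theprint/Coma-7B` shown above. The sketch below is a minimal, hedged example: the prompt format and generation parameters are illustrative assumptions, not documented recommendations for this model.

```python
def build_prompt(question: str) -> str:
    """Wrap a question in a plain instruction prompt.

    Illustrative only -- Coma-7B's preferred prompt/chat template is not
    documented on this card, so adjust to the tokenizer's chat template
    if one is provided.
    """
    return (
        "Answer the following question with step-by-step reasoning.\n\n"
        f"Question: {question}\nAnswer:"
    )

def generate(question: str, max_new_tokens: int = 256) -> str:
    """Load Coma-7B and generate a completion (downloads ~7.6B weights)."""
    # Import lazily so prompt helpers work without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "theprint/Coma-7B"  # repo id from this card; assumed to be on the Hub
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

if __name__ == "__main__":
    print(generate("If all bloops are razzies and all razzies are lazzies, "
                   "are all bloops lazzies?"))
```

With the FP8 quantization and 32k context listed above, the hosted endpoint may behave differently from a local full-precision load; this sketch only shows the general `transformers` workflow.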
