Ma7ee7/Meet7_0.6b_Exp
Text Generation · Concurrency Cost: 1 · Model Size: 0.8B · Quant: BF16 · Ctx Length: 32k · Published: Mar 9, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm

Ma7ee7/Meet7_0.6b_Exp is an experimental 0.8-billion-parameter causal language model developed by Ma7ee7, produced by continuing the fine-tuning of Ma7ee7/Meet7_0.6b at a lower learning rate on a 600-sample dataset, with the goal of balancing commonsense and reasoning ability. It performs best on commonsense benchmarks such as HellaSwag, PIQA, and Winogrande, and offers more consistent outputs than its base model.
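
A minimal text-generation sketch, assuming the weights are published on the Hugging Face Hub under the repo id shown in the title and load with the standard transformers causal-LM API; the repo id, prompt, and generation settings below are illustrative, not confirmed by the model card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub repo id, taken from the model card title.
model_id = "Ma7ee7/Meet7_0.6b_Exp"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # BF16, matching the quantization listed above
)

# Illustrative prompt; the model is a plain causal LM, so plain-text completion is used.
prompt = "The quickest way to learn a new language is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```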
