Ma7ee7/Meet7.1_0.6b_Exp
Text Generation
- Concurrency cost: 1
- Model size: 0.8B
- Quantization: BF16
- Context length: 32k
- Published: Mar 26, 2026
- License: apache-2.0
- Architecture: Transformer
- Open weights: yes

Ma7ee7/Meet7.1_0.6b_Exp is a 0.8 billion parameter Qwen3-based causal language model developed by Ma7ee7. It was fine-tuned from Ma7ee7/Meet7_0.6b_Exp, with training accelerated using Unsloth and Hugging Face's TRL library. A 32,768-token context length makes it suitable for applications that require efficient processing of longer sequences.
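Since this is a standard causal language model with open weights, it can presumably be loaded with Hugging Face Transformers. The sketch below is illustrative, not taken from the model card: the `generate` helper, prompt, and generation settings are assumptions, and only the repo id `Ma7ee7/Meet7.1_0.6b_Exp` comes from the listing above.

```python
# Hypothetical usage sketch: loading the model with Hugging Face Transformers.
# The repo id is from the model card; everything else here is illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Ma7ee7/Meet7.1_0.6b_Exp"


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a completion for `prompt` using greedy decoding defaults."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the model card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Explain context length in one sentence."))
```

Note that the 32k context window applies to the combined prompt and generated tokens, so long inputs leave correspondingly less room for output.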
