qtaka/gensyn-checkpoints-grazing_noisy_ladybug
Text Generation · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 20, 2025 · Architecture: Transformer · Status: Warm

qtaka/gensyn-checkpoints-grazing_noisy_ladybug is a 0.5 billion parameter instruction-tuned language model, fine-tuned from Gensyn/Qwen2.5-1.5B-Instruct. It was trained with GRPO (Group Relative Policy Optimization), a reinforcement-learning method designed to improve mathematical reasoning. Building on its Qwen2.5 base architecture and this specialized training, the model is suited to tasks that require step-by-step reasoning, particularly in mathematical contexts.
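As a Qwen2.5-derived instruct model, it can be run with the standard Hugging Face `transformers` API. The sketch below is illustrative, not an official usage guide: the generation settings are assumptions, and `build_chat` hand-writes the ChatML-style prompt format used by Qwen2.5 models (normally produced by `tokenizer.apply_chat_template`) to make the format explicit.

```python
MODEL_ID = "qtaka/gensyn-checkpoints-grazing_noisy_ladybug"

def build_chat(messages):
    """Format a message list in the ChatML style used by Qwen2.5 models.

    Shown explicitly for clarity; tokenizer.apply_chat_template
    produces an equivalent string.
    """
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
        for m in messages
    ]
    parts.append("<|im_start|>assistant\n")  # cue the model to respond
    return "".join(parts)

def main():
    # Heavy imports kept local so the helper above stays importable
    # without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    prompt = build_chat([
        {"role": "user", "content": "What is 17 * 24? Show your reasoning."}
    ])
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, not the prompt.
    print(tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:],
        skip_special_tokens=True,
    ))

if __name__ == "__main__":
    main()
```

With a 32k context window, longer multi-turn math dialogues fit in a single prompt; the same `build_chat` helper extends naturally by appending prior assistant turns to the message list.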
