atomwalk12/LinalgZero-GRPO-merged
Text generation · Concurrency cost: 1 · Model size: 3.1B · Quantization: BF16 · Context length: 32k · Published: Mar 10, 2026 · Architecture: Transformer · Status: Warm
LinalgZero-GRPO-merged is a fine-tuned language model developed by atomwalk12, built on the LinalgZero-SFT model. It was fine-tuned with the GSPO algorithm on the linalgzero-grpo dataset, using ART for training, and is optimized for tasks in the LinalgZero project.
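A minimal usage sketch, assuming the merged checkpoint is published under the repo id shown in the title and loads with the standard Hugging Face Transformers API; the prompt and generation settings are purely illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id taken from the model card title (assumed to be a standard HF checkpoint).
model_id = "atomwalk12/LinalgZero-GRPO-merged"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# BF16 matches the quantization listed above; device_map="auto" requires `accelerate`.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Illustrative linear-algebra prompt; the actual prompt format may differ.
prompt = "Compute the determinant of [[1, 2], [3, 4]]."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```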