atomwalk12/LinalgZero-SFT
Text generation · Concurrency cost: 1 · Model size: 3.1B · Quantization: BF16 · Context length: 32k · Published: Dec 10, 2025 · Architecture: Transformer

atomwalk12/LinalgZero-SFT is a fine-tuned language model developed by atomwalk12, derived from atomwalk12/LinalgZero-SFT-LoRA. It was trained with TRL on the atomwalk12/linalgzero-sft dataset and is intended for general text-generation tasks, including conversational responses.
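A minimal sketch of querying the model through the Hugging Face Transformers `pipeline` API. The repository id comes from this card; the prompt, generation settings, and helper names are illustrative assumptions, not part of the official model documentation.

```python
# Sketch: chat-style generation with atomwalk12/LinalgZero-SFT.
# Assumes the checkpoint follows the standard Transformers layout
# and ships a chat template (typical for TRL-trained models).

def build_messages(user_prompt: str) -> list[dict]:
    # Standard chat-message format consumed by pipeline("text-generation", ...).
    return [{"role": "user", "content": user_prompt}]

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    # Heavy dependencies are imported lazily so the helpers above
    # stay importable without torch/transformers installed.
    import torch
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="atomwalk12/LinalgZero-SFT",
        torch_dtype=torch.bfloat16,  # matches the BF16 precision listed on the card
    )
    out = generator(build_messages(prompt), max_new_tokens=max_new_tokens)
    # With chat-format input, generated_text is the full message list;
    # the last entry is the assistant reply.
    return out[0]["generated_text"][-1]["content"]

# Usage (downloads the ~3B checkpoint on first call):
#     print(generate("Compute the determinant of [[1, 2], [3, 4]]."))
```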
