atomwalk12/LinalgZero-SFT-110-checkpoint-300

Status: Warm · Public
Parameters: 3.1B
Tensor type: BF16
Context length: 32768 tokens
Updated: Dec 3, 2025
Hosted on: Hugging Face

atomwalk12/LinalgZero-SFT-110-checkpoint-300 is a 3.1-billion-parameter language model with a 32,768-token context length. As the name suggests, it is a supervised fine-tuning (SFT) checkpoint, saved at step 300; however, the available documentation does not specify its base architecture, training data, intended use cases, or how it differs from other models.
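Since the checkpoint is hosted on Hugging Face in BF16, it can presumably be loaded with the standard `transformers` API. The sketch below is an assumption based on the listed metadata (repository id, BF16 tensor type), not usage instructions from the model's own documentation; the model may require a specific chat template or generation settings that are not documented here.

```python
MODEL_ID = "atomwalk12/LinalgZero-SFT-110-checkpoint-300"


def load_model():
    """Load the tokenizer and the BF16 weights for this checkpoint.

    Hypothetical loading sketch: assumes the repository exposes a
    standard causal-LM checkpoint compatible with AutoModelForCausalLM.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the listed BF16 tensor type
        device_map="auto",           # place layers on available devices
    )
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_model()
    inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Note that a 3.1B model in BF16 needs roughly 6-7 GB of memory for the weights alone, so a GPU (or `device_map="auto"` offloading) is advisable.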
