anujjamwal/OpenMath-Nemotron-1.5B-PruneAgnostic
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Context length: 32k · Published: Mar 5, 2026 · Architecture: Transformer · Status: Warm
anujjamwal/OpenMath-Nemotron-1.5B-PruneAgnostic is a 1.5-billion-parameter language model based on the Nemotron architecture, fine-tuned by anujjamwal using Supervised Fine-Tuning (SFT) with the TRL framework. This training approach targets performance on specific tasks rather than broad general-purpose capability, so the model is best suited to specialized text generation or understanding within its training domain.
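Since the card describes a causal text-generation model, a minimal inference sketch may help. This assumes the model is published under this ID on the Hugging Face Hub and loads through the standard `transformers` causal-LM API; the `solve` helper name is illustrative, not part of the model's documentation.

```python
# Hypothetical inference sketch for this model card.
# Assumes `transformers` and `torch` are installed and the model ID
# resolves on the Hugging Face Hub; nothing here is from the card itself
# except the model ID and the BF16 quantization noted in its metadata.
MODEL_ID = "anujjamwal/OpenMath-Nemotron-1.5B-PruneAgnostic"

def solve(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion from the fine-tuned model."""
    # Deferred import so the module loads even without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

For an SFT'd math model like this, prompts phrased as problem statements (e.g. `solve("Find the sum of the first 100 positive integers.")`) are the likely intended usage, though the exact prompt format depends on the training data.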