anujjamwal/OpenMath-Nemotron-1.5B-PruneAware-2
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Context length: 32k · Published: Mar 11, 2026 · Architecture: Transformer · Warm

OpenMath-Nemotron-1.5B-PruneAware-2 is a 1.5-billion-parameter language model published by anujjamwal, fine-tuned from an OpenMath-Nemotron-1.5B-PruneAware-2 base checkpoint using the TRL framework, which points to reinforcement learning from human feedback or a similar fine-tuning technique. Its 32,768-token context length suits tasks that require extensive contextual understanding, and the 'OpenMath' designation suggests the fine-tuning targets mathematical reasoning or related domains.
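As a rough illustration of how such a causal language model would typically be used, the sketch below loads the model with the Hugging Face `transformers` library and guards generation against the 32k context window. The repo id, availability on the Hub, and the `generate` helper are assumptions for illustration, not confirmed by this card.

```python
# Hypothetical usage sketch: assumes the checkpoint is hosted on the
# Hugging Face Hub under this repo id (not confirmed by the card).
MODEL_ID = "anujjamwal/OpenMath-Nemotron-1.5B-PruneAware-2"
MAX_CONTEXT = 32768  # context length stated on the card

def fits_in_context(prompt_tokens: int, max_new_tokens: int) -> bool:
    # The 32k window must hold both the prompt and the generated tokens.
    return prompt_tokens + max_new_tokens <= MAX_CONTEXT

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    # transformers is imported lazily so the small helper above stays
    # usable without the heavyweight dependency installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer(prompt, return_tensors="pt")
    if not fits_in_context(inputs["input_ids"].shape[1], max_new_tokens):
        raise ValueError("prompt plus generation budget exceeds the 32k context")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Solve for x: 3x + 7 = 22."))
```

The context check mirrors how serving stacks budget tokens: a long prompt leaves proportionally less room for the generated answer within the fixed 32,768-token window.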
