Writer/palmyra-mini-thinking-b
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Published: Sep 10, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

Writer/palmyra-mini-thinking-b is a 1.5 billion parameter causal language model developed by Writer, fine-tuned from nvidia/OpenReasoning-Nemotron-1.5B. The model is optimized for complex reasoning and supports a context window of up to 131,072 tokens (served here with a 32k context length). It performs strongly on mathematical and competitive programming tasks, including advanced high school mathematics and algorithmic problem solving, making it suitable for work that requires deep, multi-step logical reasoning.
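As a minimal usage sketch, the model can be loaded with the Hugging Face transformers library like any causal language model. The prompt below is an illustrative example, not from the model card; verify generation parameters against Writer's documentation.

```python
# Hypothetical usage sketch for Writer/palmyra-mini-thinking-b via
# Hugging Face transformers (standard causal-LM loading pattern).
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Writer/palmyra-mini-thinking-b"

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Tokenize a prompt, run greedy generation, and decode the result."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the published quantization of this checkpoint.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Example reasoning-style prompt (illustrative only).
    print(generate("What is the sum of the first 100 positive integers?"))
```

For a reasoning-tuned model like this one, a generous `max_new_tokens` budget is usually needed so the model has room to emit its intermediate thinking before the final answer.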
