prithivMLmods/Primal-Opus-14B-Optimus-v2
Text Generation
Concurrency Cost: 1 · Model Size: 14.8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 27, 2025 · License: apache-2.0 · Architecture: Transformer

Primal-Opus-14B-Optimus-v2 by prithivMLmods is a 14.8 billion parameter language model based on the Qwen 2.5 architecture, fine-tuned on a synthetic dataset derived from DeepSeek R1. It is specifically optimized for enhanced chain-of-thought (CoT) reasoning, logical problem-solving, and structured data processing. The model supports a 128K token context window and excels in complex reasoning tasks, instruction-following, and multilingual applications across 29 languages.
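Since the model is built on Qwen 2.5, it presumably expects ChatML-formatted prompts (`<|im_start|>` / `<|im_end|>` markers). The sketch below shows what that prompt layout looks like; it is an illustration only, and in practice you would let `tokenizer.apply_chat_template` from the `transformers` library render this for you. The `build_chatml_prompt` helper is hypothetical, not part of any library.

```python
# Sketch: ChatML-style prompt construction, assuming Primal-Opus-14B-Optimus-v2
# inherits the Qwen 2.5 chat template. With transformers installed you would
# instead call tokenizer.apply_chat_template(messages, add_generation_prompt=True).

def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts into a ChatML prompt string."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    # A trailing assistant header cues the model to generate its reply.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

messages = [
    {"role": "system", "content": "You are a careful step-by-step reasoner."},
    {"role": "user", "content": "If a train travels 120 km in 1.5 hours, "
                                "what is its average speed?"},
]
prompt = build_chatml_prompt(messages)
```

For chain-of-thought prompting, a system message that asks for explicit reasoning steps (as above) is a common way to exercise the model's CoT fine-tuning.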
