prithivMLmods/Porpoise-Opus-14B-Exp
Text generation · Concurrency cost: 1 · Model size: 14.8B · Quant: FP8 · Context length: 32k · Published: Feb 26, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

Porpoise-Opus-14B-Exp is a 14.8 billion parameter language model developed by prithivMLmods, based on the Qwen 2.5 architecture. Optimized for general-purpose reasoning and question answering, it excels at contextual understanding, logical deduction, and multi-step problem solving. The model supports a 128K-token input context and up to 8K tokens of output, along with multilingual proficiency across 29 languages. It is primarily designed for applications requiring enhanced reasoning, instruction following, and long-context processing.
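Since the model follows the Qwen 2.5 architecture, it can presumably be loaded with the standard Hugging Face `transformers` text-generation flow. The sketch below is a minimal usage example under that assumption; the repo id is taken from this page, while the prompt text and generation settings are illustrative. The `build_chat` helper shows Qwen 2.5's ChatML-style message format manually, though in practice `tokenizer.apply_chat_template` is the usual route.

```python
MODEL_ID = "prithivMLmods/Porpoise-Opus-14B-Exp"


def build_chat(messages):
    """Manually format messages in Qwen 2.5's ChatML style.

    Illustrative fallback only; tokenizer.apply_chat_template is preferred.
    """
    parts = [f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>" for m in messages]
    parts.append("<|im_start|>assistant\n")  # generation prompt for the assistant turn
    return "\n".join(parts)


def main():
    # Imported lazily so the helper above can be used without the heavy deps.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )

    messages = [{"role": "user", "content": "Explain step by step why 17 is prime."}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=512)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    print(tokenizer.decode(out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

Running the full pipeline requires enough GPU memory for a 14.8B model (roughly 15 GB at FP8, more at higher precision); `device_map="auto"` lets `accelerate` shard or offload layers when a single device is insufficient.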
