prithivMLmods/Pegasus-Opus-14B-Exp
Text generation · Concurrency cost: 1 · Model size: 14.8B · Quant: FP8 · Context length: 32k · Published: Feb 26, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

Pegasus-Opus-14B-Exp is a 14.8-billion-parameter language model developed by prithivMLmods, based on the Qwen 2.5 architecture. It is fine-tuned for general-purpose reasoning, excelling in contextual understanding, logical deduction, and multi-step problem-solving. The underlying model supports up to 128K input tokens and 8K output tokens (served here with a 32,768-token context window) and offers multilingual proficiency across 29 languages, making it suitable for complex analytical and conversational AI applications.
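As an open-weights model on the Hugging Face Hub, it can presumably be loaded with the standard `transformers` API. The sketch below is a minimal, hedged example of that usage, not an official quickstart from the model card: the system prompt, generation parameters, and helper function are illustrative assumptions, and only the model ID comes from this page.

```python
# Minimal usage sketch for Pegasus-Opus-14B-Exp via Hugging Face transformers.
# Assumptions: standard AutoModelForCausalLM / chat-template workflow; the
# system prompt and generation settings are illustrative, not prescribed.
model_id = "prithivMLmods/Pegasus-Opus-14B-Exp"

def build_messages(user_prompt: str) -> list[dict]:
    """Chat-format messages for the tokenizer's chat template."""
    return [
        {"role": "system", "content": "You are a helpful reasoning assistant."},
        {"role": "user", "content": user_prompt},
    ]

if __name__ == "__main__":
    # Heavy: downloads ~15 GB of weights; requires `pip install transformers torch`.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, device_map="auto", torch_dtype="auto"
    )
    inputs = tokenizer.apply_chat_template(
        build_messages("Explain step by step why 17 is prime."),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    out = model.generate(inputs, max_new_tokens=512)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The model load and generation are kept behind the `__main__` guard so the helper can be reused (e.g. with a hosted inference endpoint) without pulling the full FP8 weights locally.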
