prithivMLmods/Gaea-Opus-14B-Exp
Text generation

- Model size: 14.8B parameters
- Quantization: FP8
- Context length: 32k
- Published: Mar 11, 2025
- License: apache-2.0
- Architecture: Transformer
- Concurrency cost: 1
- Open weights

Gaea-Opus-14B-Exp is a 14.8 billion parameter language model developed by prithivMLmods, based on the Qwen 2.5 architecture. It is optimized for general-purpose reasoning, contextual understanding, and multi-step problem-solving, having been fine-tuned with long chain-of-thought reasoning traces and specialized datasets. The model supports a 32,768-token input context and can generate up to 8K output tokens, making it suitable for detailed, structured responses in over 29 languages.
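Since the model is based on the Qwen 2.5 architecture, it can presumably be loaded with the standard Hugging Face `transformers` causal-LM APIs. The sketch below is a minimal, hedged usage example (not taken from the card itself); it assumes the checkpoint ships a chat template and runs on hardware with enough memory for a 14.8B model:

```python
# Hypothetical usage sketch: loading Gaea-Opus-14B-Exp with Hugging Face
# transformers, assuming the checkpoint follows the standard Qwen 2.5 layout.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "prithivMLmods/Gaea-Opus-14B-Exp"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the checkpoint's native precision
    device_map="auto",    # spread layers across available devices
)

# Multi-step reasoning prompt, formatted with the model's chat template.
messages = [
    {"role": "system", "content": "You are a helpful reasoning assistant."},
    {"role": "user", "content": "Walk me through solving 12 * 34 step by step."},
]
text = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
inputs = tokenizer(text, return_tensors="pt").to(model.device)

# The card quotes up to 8K output tokens; cap generation accordingly.
outputs = model.generate(**inputs, max_new_tokens=8192)
print(tokenizer.decode(outputs[0][inputs.input_ids.shape[1]:],
                       skip_special_tokens=True))
```

Running this requires downloading the weights, so treat it as a template rather than a drop-in script; the `max_new_tokens` cap reflects the 8K output limit stated above.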
