uproai/RosMistral-2x7B

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 21, 2024 · Architecture: Transformer · Cold

uproai/RosMistral-2x7B is a 7 billion parameter language model from uproai, based on the Mistral architecture. This model is deprecated; users are directed to uproai/Rose-2x7B or uproai/Rose-2x7B-GGUF as its current replacements. It is intended for general language generation tasks and supports a context length of 4096 tokens.


Model Overview

uproai/RosMistral-2x7B is a 7 billion parameter language model developed by uproai, built upon the Mistral architecture. This model is deprecated, and users are strongly advised to consider its successors, uproai/Rose-2x7B or uproai/Rose-2x7B-GGUF, for any new projects or applications.

Key Characteristics

  • Architecture: Based on the Mistral architecture.
  • Parameter Count: 7 billion parameters.
  • Context Length: Supports a context window of 4096 tokens.
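Since the context window is capped at 4096 tokens, callers typically need to trim long inputs before generation. A minimal sketch of that budgeting, using a stand-in token list rather than the model's real tokenizer (the helper name and the 256-token output reserve are illustrative assumptions, not part of any uproai API):

```python
# Sketch: keeping prompts within the model's 4096-token context window.
# MAX_CTX comes from this model card; reserve_for_output is an assumed
# margin left for generated tokens, and the token ids are placeholders.
MAX_CTX = 4096

def truncate_to_context(tokens, max_ctx=MAX_CTX, reserve_for_output=256):
    """Keep the most recent tokens, leaving room for generated output."""
    budget = max_ctx - reserve_for_output
    return tokens[-budget:] if len(tokens) > budget else tokens

tokens = list(range(5000))  # pretend token ids from a tokenizer
kept = truncate_to_context(tokens)
print(len(kept))  # 3840
```

Keeping the most recent tokens (rather than the earliest) preserves the end of a conversation or document, which is usually what generation should condition on.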

Usage Recommendation

Given its deprecated status, this model should not be used for new deployments. Developers are encouraged to migrate to the recommended Rose-2x7B or Rose-2x7B-GGUF models, which represent the updated and maintained versions from uproai.