MaziyarPanahi/Mistral-7B-v0.3
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Concurrency cost: 1 · Published: May 22, 2024 · License: apache-2.0 · Architecture: Transformer

Mistral-7B-v0.3 is a 7-billion-parameter causal language model developed by Mistral AI, based on the Mistral-7B-v0.2 architecture. This iteration extends the vocabulary to 32,768 tokens, improving tokenization coverage. It retains a 4,096-token context length and is designed for general-purpose text generation. The model suits developers who want a compact yet capable LLM with an expanded vocabulary for diverse applications.
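A minimal sketch of running this checkpoint locally with the Hugging Face `transformers` library (an assumption: the description does not prescribe a serving stack). The first call downloads the full-precision weights (roughly 14 GB), and any quantization or device placement is left to the user.

```python
# Sketch: local text generation with MaziyarPanahi/Mistral-7B-v0.3
# via transformers. Assumes a GPU or enough RAM for a 7B model.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "MaziyarPanahi/Mistral-7B-v0.3"


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a continuation of `prompt` with greedy-ish defaults."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Note that this is a base (non-instruct) checkpoint, so it works best with plain completion-style prompts rather than chat-formatted input.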
