EleutherAI/Mistral-7B-v0.1-addition-first-ft
EleutherAI/Mistral-7B-v0.1-addition-first-ft is a 7-billion-parameter language model developed by EleutherAI, fine-tuned from the Mistral-7B-v0.1 base model. It has a context length of 4096 tokens. Specific details regarding its primary differentiators, training, and intended use cases are not provided in the available model card.
Model Overview
EleutherAI/Mistral-7B-v0.1-addition-first-ft is a 7-billion-parameter language model fine-tuned by EleutherAI from Mistral-7B-v0.1, with a context length of 4096 tokens.
Key Characteristics
- Model Type: 7-billion-parameter language model.
- Base Model: Fine-tuned from Mistral-7B-v0.1.
- Context Length: Supports a context window of 4096 tokens.
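
The model card does not document a loading recipe, but if the checkpoint is published in the standard Hugging Face transformers format (an assumption not confirmed by the card), it could be loaded as sketched below. The repository ID comes from the card; the dtype, device placement, and prompt are illustrative choices only.

```python
# Minimal loading sketch, assuming a standard transformers-format checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/Mistral-7B-v0.1-addition-first-ft"  # repository ID from the model card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision keeps a 7B model around ~14 GB of weights
    device_map="auto",
)

# Arbitrary example prompt; the model's intended inputs are not documented.
prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```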
Limitations and Further Information
The model card marks key sections, including training data, evaluation results, intended uses, biases, risks, and technical specifications, as "More Information Needed." Users should exercise caution, as the model's capabilities and performance characteristics are not yet documented, and no recommendations regarding its use can be made without additional information from the developers.