Model Overview
ik-ram28/SFT-Biomistral-7B-New is a 7-billion-parameter language model based on the Mistral architecture. The repository is hosted on the Hugging Face Hub, and the "SFT" prefix in its name suggests a supervised fine-tuned variant intended for use with the transformers library.
Key Characteristics
- Model Family: Mistral-based architecture.
- Parameter Count: 7 billion parameters.
- Context Length: 4096 tokens.
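Given the characteristics above, loading the checkpoint would follow the standard transformers pattern. This is a hedged sketch, not documented usage: the repo id is taken from the model card, but the prompt format, chat template, and recommended generation settings for this checkpoint are not documented, and the `truncate_to_context` helper is a hypothetical illustration of respecting the 4096-token window.

```python
# Minimal sketch, assuming standard transformers loading works for this repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "ik-ram28/SFT-Biomistral-7B-New"  # repo id from the model card
MAX_CONTEXT = 4096  # context length stated above


def truncate_to_context(token_ids, max_len=MAX_CONTEXT):
    """Hypothetical helper: keep only the most recent tokens that fit
    in the model's context window."""
    return token_ids[-max_len:]


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Example prompt; the checkpoint's expected format is undocumented.
    inputs = tokenizer("What is hypertension?", return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Since no license or intended-use information is published yet, verify the repository's terms before deploying this in any application.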
Usage and Limitations
The provided model card marks specific details about its development, funding, language support, license, and fine-tuning origins as "More Information Needed." Consequently, its direct use cases, downstream applications, and out-of-scope uses are not explicitly defined, and its potential biases, risks, and limitations likewise await documentation. Users should keep these gaps in mind before relying on the model.
Training Details
Information on the training data, preprocessing, hyperparameters, and evaluation metrics is not yet available, so no performance benchmarks or specialized capabilities are documented for this checkpoint.