ik-ram28/SFT-Biomistral-7B-New

  • Task: Text Generation
  • Concurrency Cost: 1
  • Model Size: 7B
  • Quantization: FP8
  • Context Length: 4k
  • Published: Dec 1, 2025
  • Architecture: Transformer

The ik-ram28/SFT-Biomistral-7B-New is a 7-billion-parameter language model, a fine-tuned variant of the Mistral architecture intended for general language understanding and generation tasks. Further details on its specific differentiators and primary use cases are not provided in the available documentation.


Model Overview

The ik-ram28/SFT-Biomistral-7B-New is a 7-billion-parameter language model based on the Mistral architecture. The checkpoint has been pushed to the Hugging Face Hub as a pre-trained or fine-tuned variant ready for use with the transformers library, as sketched below.
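The following is a minimal loading-and-generation sketch with the transformers library. The repository id comes from this model card; the prompt, generation settings, and device placement are illustrative assumptions rather than documented defaults.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ik-ram28/SFT-Biomistral-7B-New"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # let transformers pick the checkpoint's dtype
    device_map="auto",    # requires accelerate; places weights on GPU if available
)

# Illustrative prompt; the card does not document a prompt format.
prompt = "Summarize the mechanism of action of metformin."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```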

Key Characteristics

  • Model Family: Mistral-based architecture.
  • Parameter Count: 7 billion parameters.
  • Context Length: 4096 tokens (see the truncation sketch after this list).
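
Because the documented context window is 4096 tokens, long inputs must be truncated with headroom left for generated tokens. Continuing from the loading sketch above, the split below (256 tokens reserved for the output) is an illustrative assumption, not a documented setting.

```python
# Cap inputs to the 4096-token context window, reserving room for generation.
max_context = 4096
max_new_tokens = 256

long_document = open("notes.txt").read()  # hypothetical input document

inputs = tokenizer(
    long_document,
    return_tensors="pt",
    truncation=True,
    max_length=max_context - max_new_tokens,  # leave headroom for the output
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```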

Usage and Limitations

The model card marks specific details regarding its development, funding, language support, license, and fine-tuning origins as "More Information Needed." Consequently, its direct use cases, downstream applications, and out-of-scope uses are not explicitly defined. Users should be aware of potential biases, risks, and limitations, as these also await further documentation.

Training Details

Information regarding the training data, preprocessing, hyperparameters, and evaluation metrics is not yet available. This means specific performance benchmarks or specialized capabilities are not detailed in the current documentation.