ik-ram28/SFT-Biomistral-7B-CPT-New

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Dec 1, 2025 · Architecture: Transformer · Status: Cold

ik-ram28/SFT-Biomistral-7B-CPT-New is a 7-billion-parameter language model with a 4096-token context length, developed by ik-ram28 as a fine-tuned variant of the Mistral architecture. The name suggests supervised fine-tuning (SFT) applied after continued pre-training (CPT) of a BioMistral-style base, which would imply a biomedical focus, but the available documentation does not confirm its training details or primary differentiators, so it is best treated as a general-purpose model derived from a Mistral base.


Model Overview

The model's card does not document its training data, fine-tuning objectives, or distinguishing capabilities. It appears to be a general-purpose language model, possibly fine-tuned for a specific task or domain, but no such specialization is stated, so users should verify its behavior on their own workloads before relying on it.

Key Characteristics

  • Parameter Count: 7 billion parameters.
  • Context Length: Supports a context window of 4096 tokens.
  • Base Architecture: Implied to be a variant of the Mistral family, given the model name.
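The 4096-token window is the hard limit on prompt plus generated tokens, so inputs must be trimmed before inference. A minimal sketch of that budgeting step (the function name and the keep-the-most-recent-tokens policy are illustrative choices, not part of the model card):

```python
MAX_CTX = 4096  # context window stated on the model card


def fit_to_context(token_ids, max_new_tokens=256, max_ctx=MAX_CTX):
    """Trim a tokenized prompt so prompt + generation fits the window.

    Keeps the most recent tokens, the usual choice for chat-style
    prompts where the latest turns matter most.
    """
    budget = max_ctx - max_new_tokens
    if budget <= 0:
        raise ValueError("max_new_tokens exceeds the context window")
    return token_ids[-budget:]


# Example: a 5000-token prompt is cut down to leave room for 256 new tokens.
prompt = list(range(5000))
trimmed = fit_to_context(prompt)
print(len(trimmed))  # 3840
```

The same budget check applies whatever serving stack is used; only the tokenizer producing `token_ids` changes.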

Usage Considerations

Due to the limited information available in the model card, users should be aware of the following:

  • Undefined Use Cases: The model's intended direct and downstream uses are not specified.
  • Unknown Training Data: Details of the training data and procedure are not provided, which makes it hard to anticipate performance characteristics and potential biases.
  • Unspecified Performance: No evaluation results or benchmarks are included, making it difficult to assess its performance on various tasks.

Users are advised to conduct their own evaluations to determine suitability for specific applications, as the model's unique strengths and limitations are not detailed.
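One lightweight way to run such an evaluation is a small exact-match harness over task-specific prompt/reference pairs. The sketch below uses a dummy stand-in for the model (the `dummy_model` function and the example cases are hypothetical); in practice the callable would wrap an inference call against ik-ram28/SFT-Biomistral-7B-CPT-New:

```python
from typing import Callable, Iterable, Tuple


def exact_match_rate(model: Callable[[str], str],
                     cases: Iterable[Tuple[str, str]]) -> float:
    """Fraction of prompts whose normalized output equals the reference."""
    cases = list(cases)
    hits = sum(model(p).strip().lower() == ref.strip().lower()
               for p, ref in cases)
    return hits / len(cases)


# Hypothetical stand-in; replace with a call into your serving stack.
def dummy_model(prompt: str) -> str:
    return "aspirin" if "antiplatelet" in prompt else "unknown"


cases = [
    ("Name a common antiplatelet drug.", "Aspirin"),
    ("Name a common statin.", "Atorvastatin"),
]
print(exact_match_rate(dummy_model, cases))  # 0.5
```

Exact match is a deliberately strict metric; for open-ended generation tasks a fuzzier score (e.g. substring or similarity matching) is usually more informative, but the harness shape stays the same.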