ik-ram28/SFT-Biomistral-7B-CPT-New
Task: Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Dec 1, 2025 · Architecture: Transformer

ik-ram28/SFT-Biomistral-7B-CPT-New is a 7-billion-parameter language model with a 4096-token context length, published by ik-ram28 as a fine-tuned variant of the Mistral architecture. Its training details and primary differentiators are not documented; the model name suggests supervised fine-tuning (SFT) applied on top of continued pretraining (CPT) of a BioMistral-7B base, but this is not confirmed in the available documentation.
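Since the model follows the standard Mistral architecture, it should load with the Hugging Face `transformers` library in the usual way. The sketch below is an assumption based on the repository identifier shown above, not on documented usage instructions; the `device_map="auto"` placement and FP8 availability depend on your local setup.

```python
MODEL_ID = "ik-ram28/SFT-Biomistral-7B-CPT-New"  # repository id from this card
CTX_LEN = 4096  # context length from this card


def load_model():
    """Load the tokenizer and model weights.

    Imports are done inside the function so that this module can be
    inspected without `transformers` installed. Loading downloads the
    7B-parameter weights, so a GPU with sufficient memory is assumed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    return tokenizer, model
```

Once loaded, the pair can be used with the standard `model.generate()` API; prompts longer than `CTX_LEN` tokens must be truncated before generation.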
