ik-ram28/SFT-Mistral-7B-New
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Architecture: Transformer · Published: Nov 26, 2025

ik-ram28/SFT-Mistral-7B-New is a 7-billion-parameter language model based on the Mistral architecture, published by ik-ram28. It is a fine-tuned variant of the base Mistral-7B model. With a 4096-token context window, it targets general language understanding and generation tasks, trading a modest model size for efficient inference.
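A minimal usage sketch is shown below, assuming the model is loadable with the Hugging Face `transformers` library under the id from this card (an assumption; adapt to whatever serving stack you actually use). The helper also illustrates the practical consequence of the 4k context window: the prompt must be clamped so that prompt tokens plus generated tokens fit within 4096.

```python
MODEL_ID = "ik-ram28/SFT-Mistral-7B-New"  # model id from this card
CTX_LEN = 4096  # context window stated on the card


def clamp_to_context(token_ids, max_new_tokens, ctx_len=CTX_LEN):
    """Drop the oldest tokens so prompt + generated tokens fit the window."""
    budget = max(0, ctx_len - max_new_tokens)
    return token_ids[-budget:] if budget else []


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Hypothetical generation helper; requires `transformers` and `torch`."""
    # Imported lazily so clamp_to_context works without these packages installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
    ids = tokenizer(prompt, return_tensors="pt").input_ids[0].tolist()
    ids = clamp_to_context(ids, max_new_tokens)
    out = model.generate(torch.tensor([ids]), max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

For prompts already within budget, `clamp_to_context` is a no-op; for longer ones it keeps the most recent tokens, which is usually the right default for chat-style continuations.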
