ik-ram28/SFT-Mistral-Instruct-chat-7B-New
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Nov 10, 2025 · Architecture: Transformer
The ik-ram28/SFT-Mistral-Instruct-chat-7B-New is a 7-billion-parameter instruction-tuned causal language model based on the Mistral architecture. Developed by ik-ram28, it is intended for chat applications and instruction following. With a context length of 4096 tokens, it supports multi-turn conversational use across a range of tasks, with an emphasis on following user instructions accurately within a chat exchange.
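The card does not document the model's chat template, but SFT models derived from Mistral Instruct typically use the `[INST] … [/INST]` prompt format. Below is a minimal sketch of that formatting, assuming this model follows the standard Mistral Instruct convention; the `format_mistral_prompt` helper is illustrative, not part of any published API:

```python
def format_mistral_prompt(messages):
    """Render a list of {"role", "content"} chat messages into the
    Mistral Instruct prompt format (an assumption for this model):
    <s>[INST] user [/INST] assistant</s>[INST] next user [/INST]"""
    prompt = "<s>"
    for msg in messages:
        if msg["role"] == "user":
            prompt += f"[INST] {msg['content']} [/INST]"
        elif msg["role"] == "assistant":
            # Assistant turns are closed with the end-of-sequence token.
            prompt += f" {msg['content']}</s>"
    return prompt

# Single-turn example:
print(format_mistral_prompt([{"role": "user", "content": "Hi"}]))
# Multi-turn example:
print(format_mistral_prompt([
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hello!"},
    {"role": "user", "content": "Summarize our chat."},
]))
```

In practice, `tokenizer.apply_chat_template(...)` from Hugging Face `transformers` should be preferred, since it reads the template shipped with the tokenizer; the sketch above only shows what that template is assumed to produce. Keep the rendered prompt plus the expected reply within the 4096-token context limit.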