ik-ram28/SFT-Mistral-instruct-CPT-7b-New
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Dec 14, 2025 · Architecture: Transformer

ik-ram28/SFT-Mistral-instruct-CPT-7b-New is a 7-billion-parameter instruction-tuned language model based on the Mistral architecture. Developed by ik-ram28, it is designed for general language understanding and generation. Its instruction tuning makes it suitable for following user prompts in conversational and other text-based applications. The model supports a context length of 4096 tokens.
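Since the model is instruction-tuned on a Mistral base, prompts are typically wrapped in Mistral's instruct template before generation. The helper below is a minimal sketch assuming this model follows the standard Mistral `[INST] ... [/INST]` format; the function name is illustrative, not part of the model's published API.

```python
def build_mistral_prompt(user_message: str) -> str:
    """Wrap a user message in the standard Mistral-instruct template.

    Assumes this model uses the common Mistral format: a BOS token
    followed by the user turn inside [INST] ... [/INST] markers.
    """
    return f"<s>[INST] {user_message.strip()} [/INST]"


# Example: the model's completion would follow the closing [/INST] tag.
prompt = build_mistral_prompt("Summarize the Mistral 7B architecture.")
```

In practice you would pass a prompt like this to the hosting provider's text-generation endpoint (or to a local tokenizer's chat template), keeping the combined prompt and completion within the 4096-token context window.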
