hZzy/mistral-7b-sft-7b-submission-win
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Architecture: Transformer · Published: Feb 28, 2026

hZzy/mistral-7b-sft-7b-submission-win is a 7-billion-parameter language model fine-tuned from Mistral-7B-Instruct-v0.3 using Supervised Fine-Tuning (SFT) with the TRL library, with a focus on instruction-following tasks. It is intended for general-purpose text generation from user prompts and inherits the instruction-following behavior of its Mistral-7B-Instruct-v0.3 base.
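Since the model derives from Mistral-7B-Instruct-v0.3, prompts generally follow the base model's `[INST] ... [/INST]` chat format. The helper below is a minimal sketch of that layout (the function name and turn structure are illustrative, not part of this model's release); in practice, prefer the tokenizer's own `apply_chat_template` to guarantee an exact match.

```python
def format_mistral_prompt(turns: list[tuple[str, str]]) -> str:
    """Render a conversation as a Mistral-instruct prompt string.

    `turns` is a list of (user, assistant) pairs; pass "" as the
    assistant reply for the final turn the model should complete.
    Sketch only -- verify against the checkpoint's chat template.
    """
    out = "<s>"  # BOS token opens the sequence
    for user, assistant in turns:
        out += f"[INST] {user} [/INST]"
        if assistant:
            # Completed assistant turns are closed with EOS
            out += f" {assistant}</s>"
    return out


print(format_mistral_prompt([("What is 2 + 2?", "")]))
# → <s>[INST] What is 2 + 2? [/INST]
```

With the prompt string built this way, generation can be driven by any standard runtime (e.g. `transformers`' `pipeline("text-generation", model="hZzy/mistral-7b-sft-7b-submission-win")`), subject to the 4k context limit listed above.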
