RikkiXu/zephyr-7b-sft-full
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Apr 19, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
RikkiXu/zephyr-7b-sft-full is a 7-billion-parameter language model fine-tuned from mistralai/Mistral-7B-v0.1. It was adapted through supervised fine-tuning (SFT) on the `generator` dataset to strengthen its text generation capabilities. Building on the Mistral 7B base, it is suited to applications that require robust, coherent text output.
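A minimal usage sketch with the Hugging Face `transformers` library is shown below. The Zephyr-style chat markup (`<|user|>` / `<|assistant|>`) is an assumption carried over from the original Zephyr SFT recipe; this fine-tune may use a different template, so check the tokenizer's chat template before relying on it. The `build_prompt` and `generate` helpers are illustrative names, not part of any published API.

```python
MODEL_ID = "RikkiXu/zephyr-7b-sft-full"

def build_prompt(user_message: str) -> str:
    # Assumed Zephyr-style chat markup; verify against the
    # tokenizer's chat template for this particular fine-tune.
    return f"<|user|>\n{user_message}</s>\n<|assistant|>\n"

def generate(user_message: str, max_new_tokens: int = 256) -> str:
    # Lazy import so the prompt helper works without transformers installed.
    # Requires: pip install transformers torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer(build_prompt(user_message), return_tensors="pt").to(model.device)
    output = model.generate(
        **inputs, max_new_tokens=max_new_tokens, do_sample=True, temperature=0.7
    )
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

At FP8 quantization a 7B model fits comfortably on a single modern GPU; `device_map="auto"` lets `transformers` place the weights automatically.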