unsloth/zephyr-sft
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4K · Published: Dec 31, 2023 · License: apache-2.0 · Architecture: Transformer

unsloth/zephyr-sft is a 7-billion-parameter language model published by Unsloth and fine-tuned for instruction following. It is optimized for efficient fine-tuning with the Unsloth library, enabling 1.9x faster training with 19% less memory usage than standard methods, and it is designed for conversational AI tasks and direct preference optimization (DPO) applications.
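Since the model targets DPO workflows, a quick sketch of the data shape such training consumes may help. DPO trainers such as TRL's `DPOTrainer` expect one row per preference comparison, with `prompt`, `chosen`, and `rejected` fields; the helper and example strings below are hypothetical illustrations, not part of the model itself.

```python
# Minimal sketch (assumption: prompt/chosen/rejected schema as used by
# common DPO trainers like TRL's DPOTrainer).

def make_dpo_record(prompt: str, chosen: str, rejected: str) -> dict:
    """Pack one human-preference comparison into a DPO training row."""
    return {"prompt": prompt, "chosen": chosen, "rejected": rejected}

# Hypothetical example pair: the preferred ("chosen") and dispreferred
# ("rejected") completions for the same prompt.
dataset = [
    make_dpo_record(
        prompt="Explain DPO in one sentence.",
        chosen=(
            "DPO fine-tunes a model directly on preference pairs, "
            "without training a separate reward model."
        ),
        rejected="DPO is a kind of database.",
    ),
]

print(len(dataset), sorted(dataset[0].keys()))
```

A list of such records can then be converted to a `datasets.Dataset` and passed to the trainer alongside the SFT checkpoint.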
