princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jul 6, 2024 · Architecture: Transformer
The princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF model is a 7-billion-parameter language model based on the Mistral architecture, fine-tuned with SLiC-HF (Sequence Likelihood Calibration with Human Feedback). Developed by Princeton NLP, it comes from the research on SimPO (Simple Preference Optimization with a Reference-Free Reward), where it served as one of the released preference-optimization checkpoints. It is designed for general language generation tasks and supports a 4096-token context length.
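As with other Hugging Face-hosted checkpoints, the model can presumably be loaded with the `transformers` library. A minimal sketch follows; the model id is taken from this page, while the generation settings (sampling, temperature, token budget) are illustrative assumptions, not recommendations from the model authors:

```python
# Minimal sketch: run text generation with this checkpoint via Hugging Face
# transformers. Requires `pip install transformers torch` and enough GPU/CPU
# memory for a 7B model; generation parameters below are illustrative.

MODEL_ID = "princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF"
CTX_LENGTH = 4096  # context length stated in the page metadata


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    # Imports kept inside the function so the sketch can be read/imported
    # without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,
    )
    # Strip the prompt tokens and decode only the newly generated text.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Explain preference optimization in one sentence."))
```

Since the checkpoint is a base-SFT model rather than a chat-tuned one, plain-text prompts (rather than a chat template) are likely the appropriate input format.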