argilla/notus-7b-v1
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quantization: FP8 · Context Length: 8k · Published: Nov 16, 2023 · License: MIT · Architecture: Transformer · Open Weights
Argilla's Notus-7b-v1 is a 7-billion-parameter GPT-like causal language model, fine-tuned with Direct Preference Optimization (DPO) on a curated version of the UltraFeedback dataset. Based on Zephyr-7b-sft-full, it is optimized for chat and assistant-style interactions, and it surpasses Zephyr-7B-beta and Claude 2 on the AlpacaEval benchmark, making it well suited to high-quality conversational AI.
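For readers unfamiliar with DPO, the core idea can be sketched as a loss over preference pairs: the policy is pushed to assign a higher implicit reward (log-probability ratio against a frozen reference model) to the chosen response than to the rejected one. The function below is a minimal single-pair illustration, not the Notus training code; the variable names and the beta value are assumptions for the sketch.

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """Direct Preference Optimization loss for one preference pair.

    Inputs are the summed log-probabilities of the chosen and rejected
    responses under the policy model and the frozen reference model.
    beta scales the implicit reward (0.1 is a common illustrative choice).
    """
    # Implicit reward margin: how much more the policy favors the chosen
    # response over the rejected one, relative to the reference model.
    logits = beta * ((policy_chosen_logp - ref_chosen_logp)
                     - (policy_rejected_logp - ref_rejected_logp))
    # -log(sigmoid(logits)), written in a numerically stable form.
    return math.log1p(math.exp(-logits))
```

Minimizing this loss over a dataset of (chosen, rejected) pairs is what aligns the model's outputs with the preferences encoded in UltraFeedback.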
Popular Sampler Settings
The three most popular parameter combinations used by Featherless users for this model adjust the following samplers:
temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
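The sampler parameters above are typically passed alongside the prompt in a completion request. The helper below sketches how such a request body might be assembled; the default values shown are illustrative placeholders, not the actual Featherless user configurations, and the exact request schema depends on the API you call.

```python
def build_completion_request(prompt, temperature=0.7, top_p=0.9, top_k=40,
                             frequency_penalty=0.0, presence_penalty=0.0,
                             repetition_penalty=1.1, min_p=0.05):
    """Assemble a text-completion request body for argilla/notus-7b-v1.

    All sampler defaults here are illustrative assumptions, not values
    reported by Featherless; tune them for your own use case.
    """
    return {
        "model": "argilla/notus-7b-v1",
        "prompt": prompt,
        "temperature": temperature,        # randomness of sampling
        "top_p": top_p,                    # nucleus sampling cutoff
        "top_k": top_k,                    # restrict to k most likely tokens
        "frequency_penalty": frequency_penalty,  # penalize frequent tokens
        "presence_penalty": presence_penalty,    # penalize any repeat token
        "repetition_penalty": repetition_penalty,  # multiplicative repeat penalty
        "min_p": min_p,                    # drop tokens below min_p * p(top token)
    }
```

A request built this way can be serialized to JSON and sent to whichever inference endpoint serves the model.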