mlabonne/NeuralDarewin-7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 8k · Published: Jan 23, 2024 · License: apache-2.0 · Architecture: Transformer

NeuralDarewin-7B by mlabonne is a 7 billion parameter language model merged from multiple Mistral-based models, including Intel/neural-chat-7b-v3-3 and openaccess-ai-collective/DPOpenHermes-7B-v2, using the dare_ties method. This model is designed for general-purpose conversational AI and instruction following, leveraging the strengths of its constituent models. It achieves an average score of 71.79 on the Open LLM Leaderboard, demonstrating strong performance across various reasoning and language understanding tasks.
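
Since the merge produces a standard Mistral-architecture causal LM, it can be loaded with the Hugging Face transformers library. The sketch below is illustrative only: the dtype, device placement, and reliance on a tokenizer-provided chat template are assumptions, not settings published for this model.

```python
# Minimal sketch: loading NeuralDarewin-7B as a standard Mistral-based
# causal LM. Dtype and device placement are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mlabonne/NeuralDarewin-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption; choose what fits your hardware
    device_map="auto",
)

# Assumes the tokenizer ships a chat template, as most instruct merges do.
messages = [{"role": "user", "content": "Summarize the DARE-TIES merge method."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```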

Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model adjust the following sampler settings:

temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p
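
For illustration, these settings might be passed on a chat completion request as sketched below. The base_url is assumed to be Featherless's OpenAI-compatible endpoint, and every value shown is a placeholder, not one of the actual top user configurations.

```python
# Sketch of sending sampler settings to an OpenAI-compatible endpoint.
# Endpoint URL and all parameter values are placeholder assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="mlabonne/NeuralDarewin-7B",
    messages=[{"role": "user", "content": "Write a haiku about model merging."}],
    # Standard OpenAI sampler parameters (placeholder values):
    temperature=0.7,
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # Parameters outside the standard OpenAI schema, passed through
    # in the request body if the server accepts them (placeholder values):
    extra_body={
        "top_k": 40,
        "repetition_penalty": 1.1,
        "min_p": 0.05,
    },
)
print(response.choices[0].message.content)
```

top_k, repetition_penalty, and min_p are not part of the standard OpenAI request schema, which is why the sketch forwards them through extra_body rather than as named arguments.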