Yhyu13/LMCocktail-Mistral-7B-v1
- Task: Text generation
- Concurrency cost: 1
- Model size: 7B
- Quantization: FP8
- Context length: 8k
- Published: Dec 27, 2023
- License: apache-2.0
- Architecture: Transformer

Yhyu13/LMCocktail-Mistral-7B-v1 is a 7 billion parameter language model based on the Mistral architecture, created by Yhyu13. It is a 50%-50% merge of Mistral-7B-Instruct-v0.2 and xDAN-L1-Chat-RL-v1, produced with the LM-Cocktail merging technique. The model performs well on conversational benchmarks, ranking highly on AlpacaEval, which makes it suitable for general-purpose chat and instruction-following applications.
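The 50%-50% merge described above amounts to a linear interpolation of the two parent models' weights. The sketch below illustrates the idea on toy data; the parameter vectors and names are stand-ins for real weight tensors, not the actual merge script used for this model.

```python
def merge_weights(params_a, params_b, weight_a=0.5):
    """Linearly interpolate two models' parameters.

    params_a / params_b: dicts mapping parameter names to lists of
    floats (stand-ins for the real weight tensors). weight_a=0.5
    gives the 50%-50% merge used for this model.
    """
    merged = {}
    for name, vec_a in params_a.items():
        vec_b = params_b[name]
        merged[name] = [weight_a * a + (1.0 - weight_a) * b
                        for a, b in zip(vec_a, vec_b)]
    return merged

# Toy example: one "layer" with two weights from each parent model.
a = {"layer.weight": [1.0, 0.0]}
b = {"layer.weight": [0.0, 2.0]}
print(merge_weights(a, b))  # {'layer.weight': [0.5, 1.0]}
```

The full LM-Cocktail method generalizes this to weighted averages over more than two models, with weights chosen from performance on a small validation set.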


Popular Sampler Settings

The three parameter combinations most commonly used by Featherless users for this model. Each configuration sets the following sampler parameters:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
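These sampler settings are typically passed alongside the prompt in an OpenAI-compatible chat-completions request body. The sketch below shows one way to assemble such a body; the helper function and the specific parameter values are illustrative assumptions, not the actual Featherless user statistics.

```python
def build_request(prompt, sampler):
    """Combine a prompt with a sampler-settings dict into a
    chat-completions request body (OpenAI-compatible shape)."""
    body = {
        "model": "Yhyu13/LMCocktail-Mistral-7B-v1",
        "messages": [{"role": "user", "content": prompt}],
    }
    body.update(sampler)
    return body

# Illustrative sampler config (values are placeholders, not the
# top-ranked Featherless settings):
sampler = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}
request_body = build_request("Hello!", sampler)
```

The resulting dict can then be serialized as JSON and POSTed to the provider's chat-completions endpoint.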