MiniMoog/Mergerix-7b-v0.1
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 8K · Published: Apr 2, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
Mergerix-7b-v0.1 by MiniMoog is a 7-billion-parameter merged language model combining Kukedlc/NeuralSirKrishna-7b-DPO and Kukedlc/NeuralAlgo-7B-DPO. The merge uses SLERP (spherical linear interpolation) with separate interpolation weights for the self-attention and MLP layers, blending the capabilities of its two DPO-tuned parents. It is intended for general text generation tasks, inheriting the instruction-following and conversational strengths of its base models.
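As an illustration of how a SLERP merge treats self-attention and MLP layers differently, here is a minimal Python sketch. The per-layer interpolation factors (`T_SELF_ATTN`, `T_MLP`) and the layer-name matching are hypothetical placeholders; the actual weighting used for Mergerix-7b-v0.1 is not reproduced in this card.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors of the same shape."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_dir = a_flat / (a_flat.norm() + eps)
    b_dir = b_flat / (b_flat.norm() + eps)
    dot = torch.clamp(a_dir @ b_dir, -1.0, 1.0)
    omega = torch.acos(dot)            # angle between the two weight directions
    if omega.abs() < eps:              # nearly parallel: fall back to plain lerp
        return (1 - t) * a + t * b
    so = torch.sin(omega)
    out = (torch.sin((1 - t) * omega) / so) * a_flat + (torch.sin(t * omega) / so) * b_flat
    return out.reshape(a.shape).to(a.dtype)

# Hypothetical per-layer-type factors (0 = first model, 1 = second model):
T_SELF_ATTN = 0.6
T_MLP = 0.4

def merge_param(name: str, w_a: torch.Tensor, w_b: torch.Tensor) -> torch.Tensor:
    """Pick an interpolation factor based on which sublayer a parameter belongs to."""
    t = T_SELF_ATTN if "self_attn" in name else T_MLP if "mlp" in name else 0.5
    return slerp(t, w_a, w_b)
```

Applying `merge_param` across the matching state dicts of the two parent checkpoints yields the merged model; in practice, tools such as mergekit express the same idea declaratively in a config file.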
Popular Sampler Settings
The parameter combinations most commonly used by Featherless users for this model involve the following sampler settings:
temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
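For reference, here is a minimal sketch of passing these sampler settings to the model through an OpenAI-compatible endpoint (Featherless exposes one; the URL and all sampler values below are illustrative assumptions, not recommended settings for this model).

```python
from openai import OpenAI

# Assumes Featherless's OpenAI-compatible endpoint; values are placeholders.
client = OpenAI(
    base_url="https://api.featherless.ai/v1",
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="MiniMoog/Mergerix-7b-v0.1",
    messages=[{"role": "user", "content": "Explain SLERP model merging in one paragraph."}],
    temperature=0.7,
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # Parameters outside the OpenAI spec are passed through extra_body:
    extra_body={
        "top_k": 40,
        "repetition_penalty": 1.1,
        "min_p": 0.05,
    },
)
print(response.choices[0].message.content)
```

Note that `temperature`, `top_p`, `frequency_penalty`, and `presence_penalty` are standard OpenAI parameters, while `top_k`, `repetition_penalty`, and `min_p` must be forwarded via `extra_body` because they are backend-specific extensions.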