saucam/mistral-orpo-beta-NeuralBeagle14-7B-dare-ties
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 16, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
saucam/mistral-orpo-beta-NeuralBeagle14-7B-dare-ties is a 7-billion-parameter language model created by saucam by merging kaist-ai/mistral-orpo-beta and mlabonne/NeuralBeagle14-7B with the DARE-TIES merge method. The merge aims to combine the strengths of its parents: the ORPO preference fine-tuning of kaist-ai/mistral-orpo-beta and the general-purpose capabilities of NeuralBeagle14-7B. It is intended for general text generation tasks.
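A merge like this is typically produced with mergekit. The sketch below is a hypothetical mergekit configuration for a DARE-TIES merge of the two parent models named above; the base model, density, and weight values are illustrative assumptions, not the settings saucam actually used.

```yaml
# Hypothetical mergekit config (values are assumptions, not saucam's actual settings)
models:
  - model: kaist-ai/mistral-orpo-beta
    parameters:
      density: 0.5   # fraction of delta weights kept (DARE drop-and-rescale)
      weight: 0.5    # contribution of this model to the merge
  - model: mlabonne/NeuralBeagle14-7B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1  # assumed common ancestor of both parents
dtype: bfloat16
```

With mergekit installed, a config like this would be run with `mergekit-yaml config.yml ./merged-model`, writing the merged weights to the output directory.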