mayacinka/ramonda-7b-dpo-ties
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Feb 17, 2024 · License: apache-2.0 · Architecture: Transformer

mayacinka/ramonda-7b-dpo-ties is a 7-billion-parameter language model by mayacinka, created by merging paulml/OGNO-7B and bardsai/jaskier-7b-dpo-v4.3 with the TIES merging method. It scores an average of 76.19 on the Open LLM Leaderboard and 62.12 on LLM AutoEval, indicating strong performance across reasoning and language-understanding tasks. The model is suited to general-purpose applications that need robust text generation and comprehension within a 4096-token context window.
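
A minimal inference sketch is shown below. It assumes the repository can be loaded with the standard Hugging Face transformers API (AutoTokenizer / AutoModelForCausalLM); the precision setting and generation parameters are illustrative assumptions, not taken from the model card.

```python
# Minimal inference sketch, assuming the standard Hugging Face transformers
# workflow; dtype and sampling settings below are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mayacinka/ramonda-7b-dpo-ties"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: half precision to fit a 7B model on one GPU
    device_map="auto",
)

prompt = "Explain the TIES model-merging method in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# The 4096-token context window bounds prompt plus generated tokens combined.
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```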
