Gille/StrangeMerges_5-7B-ties
Text Generation
- Concurrency Cost: 1
- Model Size: 7B
- Quant: FP8
- Ctx Length: 4k
- Published: Jan 28, 2024
- License: apache-2.0
- Architecture: Transformer
- Open Weights
Gille/StrangeMerges_5-7B-ties is a 7 billion parameter language model created by Gille, built by TIES-merging StrangeMerges_1-7B-slerp and NeuralTurdusVariant1-7B on top of the mncai/mistral-7b-dpo-v5 base model. The merged model achieves an average score of 73.89 on the Open LLM Leaderboard, indicating strong performance across reasoning and language-understanding benchmarks. It is designed for general-purpose language generation and understanding tasks, leveraging the combined strengths of its constituent models.
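To make the TIES merge concrete, the sketch below implements the standard trim / elect-sign / disjoint-mean procedure on toy numpy arrays. This is an illustration of the technique, not the actual merge recipe used for this model; the `density` and `lam` parameters are hypothetical values, and real merges (e.g. via mergekit) operate per-tensor over full checkpoints.

```python
import numpy as np

def ties_merge(base, models, density=0.5, lam=1.0):
    """Toy TIES merge: trim each task vector, elect a sign per
    parameter, then average only the sign-agreeing deltas."""
    # Task vectors: each fine-tuned model's delta from the base.
    deltas = [m - base for m in models]

    # Trim: keep only the top `density` fraction of each delta by magnitude.
    trimmed = []
    for d in deltas:
        k = int(np.ceil(density * d.size))
        thresh = np.sort(np.abs(d).ravel())[::-1][k - 1]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    stacked = np.stack(trimmed)

    # Elect sign: per parameter, take the sign of the summed trimmed deltas.
    sign = np.sign(stacked.sum(axis=0))

    # Disjoint mean: average only the deltas that agree with the elected sign.
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    num = np.where(agree, stacked, 0.0).sum(axis=0)
    den = np.maximum(agree.sum(axis=0), 1)
    return base + lam * num / den

base = np.zeros(4)
a = base + np.array([1.0, -2.0, 0.1, 0.05])
b = base + np.array([3.0, 2.0, 0.0, 0.2])
merged = ties_merge(base, [a, b], density=0.5)
# Param 0: both deltas agree in sign, so they are averaged (2.0).
# Param 1: the signs conflict and cancel, so the parameter is left at base.
```

The key property shown here is why TIES often beats naive averaging: conflicting updates (param 1) are resolved by sign election instead of being blended into a washed-out compromise.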