luqmanxyz/LelaStarling-7B
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 20, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

LelaStarling-7B is a 7-billion-parameter language model created by luqmanxyz, produced by merging SanjiWatsuki/Lelantos-DPO-7B and berkeley-nest/Starling-LM-7B-alpha with the SLERP (spherical linear interpolation) merge method. The model targets general text generation, combining the strengths of its constituent models. It achieves an average score of 71.45 on the Open LLM Leaderboard, with notable performance on reasoning and commonsense benchmarks.
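The card does not publish the exact merge configuration. As an illustration of what a SLERP merge of these two models could look like with the mergekit tool, here is a minimal sketch; the `layer_range`, interpolation weights `t`, base model choice, and `dtype` are assumptions, not the values actually used for LelaStarling-7B:

```yaml
# Hypothetical mergekit SLERP config (illustrative values only)
slices:
  - sources:
      - model: SanjiWatsuki/Lelantos-DPO-7B
        layer_range: [0, 32]   # Mistral-7B-style models have 32 layers
      - model: berkeley-nest/Starling-LM-7B-alpha
        layer_range: [0, 32]
merge_method: slerp
base_model: SanjiWatsuki/Lelantos-DPO-7B
parameters:
  t:
    # Per-tensor interpolation factor between the two parents (0 = base, 1 = other)
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5   # default for all remaining tensors
dtype: bfloat16
```

SLERP interpolates along the hypersphere between corresponding weight tensors rather than averaging them linearly, which tends to preserve the geometry of each parent's weights better than a plain weighted sum.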
