scaledown/ScaleDown-7B-slerp-v0.1
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 1, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

ScaleDown-7B-slerp-v0.1 is a 7-billion-parameter model from scaledown, built as a SLERP (spherical linear interpolation) merge of OpenPipe/mistral-ft-optimized-1218 and jondurbin/bagel-dpo-7b-v0.1, combining the strengths of both parent models. It scores an average of 71.57 on the Open LLM Leaderboard, indicating strong performance across reasoning and language-understanding tasks. With a 4096-token context length, it is suited to general-purpose applications that need robust language generation and comprehension.
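The model card does not spell out the merge recipe, but a SLERP merge interpolates each pair of corresponding weight tensors along the arc of the hypersphere between them rather than along a straight line, which tends to preserve the geometry of both parents better than plain averaging. The sketch below is illustrative only: the `slerp` helper, the 0.5 interpolation factor, and the `sd_a`/`sd_b` state dicts are assumptions, not the actual configuration used to produce this model.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate values follow the arc
    on the hypersphere instead of the straight line between the tensors.
    """
    v0_flat, v1_flat = v0.flatten().float(), v1.flatten().float()
    # Normalise so the angle between the two tensors is well defined.
    v0_n = v0_flat / (v0_flat.norm() + eps)
    v1_n = v1_flat / (v1_flat.norm() + eps)
    dot = torch.clamp(torch.dot(v0_n, v1_n), -1.0, 1.0)
    theta = torch.acos(dot)
    # Nearly parallel tensors: fall back to plain linear interpolation.
    if theta.abs() < 1e-4:
        return ((1 - t) * v0_flat + t * v1_flat).reshape(v0.shape).to(v0.dtype)
    sin_theta = torch.sin(theta)
    w0 = torch.sin((1 - t) * theta) / sin_theta
    w1 = torch.sin(t * theta) / sin_theta
    return (w0 * v0_flat + w1 * v1_flat).reshape(v0.shape).to(v0.dtype)

# Merging two checkpoints parameter by parameter (sd_a and sd_b are
# hypothetical state dicts loaded from the two parent models):
# merged = {name: slerp(0.5, sd_a[name], sd_b[name]) for name in sd_a}
```

In practice, merges like this are usually produced with a dedicated tool (e.g. mergekit), which also allows per-layer interpolation factors; the uniform 0.5 above is just the simplest case.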
