eren23/slerp-test-turdus-beagle
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 16, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

eren23/slerp-test-turdus-beagle is a 7-billion-parameter language model created by eren23 by merging udkai/Turdus and mlabonne/NeuralBeagle14-7B with the slerp (spherical linear interpolation) method, using OpenPipe/mistral-ft-optimized-1218 as the base model. It achieves an average score of 75.11 on the Open LLM Leaderboard and is designed for general language understanding and generation, with strong results on reasoning and common-sense benchmarks.
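Slerp interpolates two models' weights along the arc of a hypersphere rather than along a straight line, which tends to preserve each tensor's geometric characteristics better than plain averaging. A minimal per-tensor sketch of the interpolation itself (illustrative only; real merge tooling such as mergekit additionally handles per-layer interpolation schedules, tokenizer alignment, and config plumbing):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t is the blend factor in [0, 1]: 0 returns v0, 1 returns v1.
    Falls back to linear interpolation when the tensors are nearly
    colinear, where the slerp formula becomes numerically unstable.
    """
    a = v0.ravel().astype(np.float64)
    b = v1.ravel().astype(np.float64)
    # Angle is measured between the normalized directions.
    a_n = a / (np.linalg.norm(a) + eps)
    b_n = b / (np.linalg.norm(b) + eps)
    dot = np.clip(a_n @ b_n, -1.0, 1.0)
    if abs(dot) > 0.9995:
        # Nearly parallel: plain lerp avoids division by sin(theta) ~ 0.
        out = (1.0 - t) * a + t * b
    else:
        theta = np.arccos(dot)
        out = (np.sin((1.0 - t) * theta) * a + np.sin(t * theta) * b) / np.sin(theta)
    return out.reshape(v0.shape).astype(v0.dtype)
```

Applied tensor-by-tensor across two checkpoints with matching shapes, this produces the merged state dict; a fixed t = 0.5 blends both parents equally, while mergekit-style configs typically vary t by layer type.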
