Yuma42/KangalKhan-Ruby-7B-Fixed
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 16, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

KangalKhan-Ruby-7B-Fixed by Yuma42 is a 7-billion-parameter language model produced by merging argilla/CapybaraHermes-2.5-Mistral-7B and argilla/distilabeled-OpenHermes-2.5-Mistral-7B with spherical linear interpolation (slerp). The model targets general-purpose conversational AI, combining the strengths of its two base models. It scores an average of 68.68 on the Open LLM Leaderboard, indicating solid performance across reasoning and language-understanding tasks.
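The slerp merge mentioned above interpolates each pair of corresponding weight tensors along the arc between them rather than along the straight line, which preserves the magnitude structure of the weights. The actual merge was performed on full model checkpoints (typically with tooling such as mergekit); the function below is only an illustrative sketch of the interpolation itself, applied to plain Python vectors standing in for flattened weight tensors.

```python
import math

def slerp(v0, v1, t, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values move
    along the arc between the two vectors on the hypersphere.
    """
    dot = sum(a * b for a, b in zip(v0, v1))
    n0 = math.sqrt(sum(a * a for a in v0))
    n1 = math.sqrt(sum(b * b for b in v1))
    # Clamp to avoid domain errors from floating-point rounding.
    cos_theta = max(-1.0, min(1.0, dot / (n0 * n1 + eps)))
    theta = math.acos(cos_theta)
    if theta < eps:
        # Nearly parallel vectors: fall back to linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Midpoint between two orthogonal unit vectors stays on the unit sphere.
merged = slerp([1.0, 0.0], [0.0, 1.0], 0.5)
```

In a real merge this interpolation runs per tensor (often with a different `t` schedule for attention and MLP layers), and the merged state dict is saved as a new checkpoint.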
