Yuma42/KangalKhan-RawEmerald-7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 17, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Yuma42/KangalKhan-RawEmerald-7B is a 7-billion-parameter language model based on the Mistral architecture, created by Yuma42 by merging CapybaraHermes-2.5-Mistral-7B and distilabeled-OpenHermes-2.5-Mistral-7B. The model targets general conversational AI tasks, combining the strengths of its merged components. It achieves an average score of 69.09 on the Open LLM Leaderboard, indicating solid performance across reasoning and language-understanding benchmarks.
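As a usage sketch, the model can be loaded with the Hugging Face transformers library under standard assumptions: the prompt format below is ChatML, which the OpenHermes-family parent models use (an assumption for this merge), and the `generate_reply` helper name is hypothetical. The first call downloads the full weights, so the loading code is wrapped in a function rather than run at import time.

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Format a ChatML-style prompt (assumed format, inherited from the
    OpenHermes-based parent models of this merge)."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )


def generate_reply(user_msg: str,
                   system_msg: str = "You are a helpful assistant.") -> str:
    """Hypothetical helper: load the model and generate one reply.
    Requires transformers and torch; downloads the weights on first use."""
    # Imports are local so build_chatml_prompt works without transformers.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Yuma42/KangalKhan-RawEmerald-7B"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    prompt = build_chatml_prompt(system_msg, user_msg)
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

The 4k context length above bounds prompt plus generated tokens, so long conversations need truncation before calling `generate`.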
