Yuma42/KangalKhan-RawRuby-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 17, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
Yuma42/KangalKhan-RawRuby-7B is a 7-billion-parameter language model created by Yuma42 by merging KangalKhan-Ruby-7B-Fixed and KangalKhan-RawEmerald-7B with a slerp (spherical linear interpolation) merge, combining the strengths of both parent models. It is intended for general language tasks, offers a 4096-token context length, and performs competitively on benchmarks covering reasoning and commonsense tasks.
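For readers unfamiliar with the merge technique, the sketch below illustrates spherical linear interpolation (slerp) applied to a pair of weight tensors. It is a minimal illustration only: the tensor names, shapes, and blend factor are assumptions, and the actual per-layer interpolation schedule used to build KangalKhan-RawRuby-7B is not reproduced here.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherically interpolate between two weight tensors at blend factor t in [0, 1]."""
    v0_flat, v1_flat = v0.ravel(), v1.ravel()
    # Normalize to unit vectors to measure the angle between the two tensors.
    v0_unit = v0_flat / (np.linalg.norm(v0_flat) + eps)
    v1_unit = v1_flat / (np.linalg.norm(v1_flat) + eps)
    dot = np.clip(np.dot(v0_unit, v1_unit), -1.0, 1.0)
    omega = np.arccos(dot)
    # Fall back to plain linear interpolation when the tensors are nearly parallel.
    if np.sin(omega) < eps:
        return (1.0 - t) * v0 + t * v1
    coeff0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    coeff1 = np.sin(t * omega) / np.sin(omega)
    return (coeff0 * v0_flat + coeff1 * v1_flat).reshape(v0.shape)

# Toy example: blend one weight matrix from each parent at t = 0.5.
ruby_weight = np.random.randn(4, 4)         # stand-in for a KangalKhan-Ruby-7B-Fixed tensor
raw_emerald_weight = np.random.randn(4, 4)  # stand-in for a KangalKhan-RawEmerald-7B tensor
merged_weight = slerp(0.5, ruby_weight, raw_emerald_weight)
print(merged_weight.shape)
```

In practice, slerp merges of full models are typically performed per layer over every weight tensor, often with a different blend factor for attention and MLP blocks.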