mvpmaster/pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp
TEXT GENERATION
Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Mar 19, 2024 | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold
The mvpmaster/pmmpk-EinstainMorcoro14KrishnaHercules-7b-slerp is a 7 billion parameter language model created by mvpmaster through a slerp (spherical linear interpolation) merge of mvpmaster/kellemar-KrishnaHercules-0.1-7b-slerp and mvpmaster/Einstein-4D-Marcoro14-7b-full-slerp. By blending the weights of its two base models, it combines their strengths into a single checkpoint with general-purpose language understanding and generation capabilities. It is suitable for a variety of text-based AI applications that call for a 7B parameter model with a 4096 token context length.
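For readers unfamiliar with the merge method named in the model card, the following is a minimal numpy sketch of per-tensor slerp, the kind of interpolation applied to each pair of corresponding weight tensors when merging two models. This is an illustrative approximation, not the exact implementation used to produce this checkpoint; the function name and the toy tensors are invented for the example.

```python
import numpy as np

def slerp(t, w0, w1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    Interpolates along the great-circle arc defined by the flattened
    tensors, falling back to plain linear interpolation when the two
    directions are nearly parallel (where slerp is numerically unstable).
    """
    v0 = w0.ravel().astype(np.float64)
    v1 = w1.ravel().astype(np.float64)
    # Angle between the two weight vectors, computed on unit directions.
    n0 = v0 / np.linalg.norm(v0)
    n1 = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    omega = np.arccos(dot)
    if np.sin(omega) < eps:
        merged = (1.0 - t) * v0 + t * v1  # lerp fallback
    else:
        merged = (np.sin((1.0 - t) * omega) / np.sin(omega)) * v0 \
               + (np.sin(t * omega) / np.sin(omega)) * v1
    return merged.reshape(w0.shape)

# Toy example: merge two 2x2 "weight" tensors at the midpoint t = 0.5.
a = np.array([[1.0, 0.0], [0.0, 1.0]])
b = np.array([[0.0, 1.0], [1.0, 0.0]])
print(slerp(0.5, a, b))
```

In a real merge this function would be applied layer by layer across both models' state dicts, with `t` (possibly varying per layer) controlling how much each parent contributes.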