mvpmaster/Einstein-4D-Marcoro14-7b-full-slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Ctx length: 4k · Published: Mar 17, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold
Einstein-4D-Marcoro14-7b-full-slerp is a 7-billion-parameter language model created by mvpmaster by merging argilla/distilabeled-Marcoro14-7B-slerp-full and Weyaxi/Einstein-v4-7B with the SLERP (spherical linear interpolation) merge method. The merged model combines the capabilities of its two constituents for general language tasks within a 4096-token context window. Because SLERP interpolates along the arc between the parents' weights rather than averaging them linearly, the merge aims for balanced performance across both parents' strengths instead of favoring either one.
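For readers unfamiliar with the technique, the SLERP merge named above can be sketched as spherical linear interpolation applied per weight tensor. This is an illustrative NumPy sketch of the underlying math, not the actual code used to produce this model (merge toolchains such as mergekit implement their own version, often with per-layer interpolation factors):

```python
import numpy as np

def slerp(t, w0, w1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns w0, t=1 returns w1; intermediate t values move along
    the great-circle arc between the two (flattened, normalized) tensors.
    """
    v0 = w0.ravel() / (np.linalg.norm(w0) + eps)
    v1 = w1.ravel() / (np.linalg.norm(w1) + eps)
    dot = np.clip(np.dot(v0, v1), -1.0, 1.0)
    theta = np.arccos(dot)  # angle between the two weight directions
    if theta < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation
        return (1.0 - t) * w0 + t * w1
    s = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / s) * w0 + (np.sin(t * theta) / s) * w1

# Hypothetical usage: blend two parent checkpoints' tensors halfway.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
merged = slerp(0.5, a, b)  # lies on the unit arc between a and b
```

In a full merge, this function would be applied to each matching parameter tensor of the two parent models, with `t` (or a per-layer schedule of `t` values) controlling how much each parent contributes.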