Eric111/UltraCatunaMayo
Text Generation · Open Weights
Model Size: 7B · Quantization: FP8 · Context Length: 4k
Published: Mar 23, 2024 · License: apache-2.0 · Architecture: Transformer

UltraCatunaMayo by Eric111 is a 7 billion parameter language model, created by merging mlabonne/UltraMerge-7B and Eric111/CatunaMayo using spherical linear interpolation (slerp). The merge combines the strengths of its constituent models, offering a versatile base for general-purpose natural language processing tasks that draws on the knowledge of both parents.
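As a rough illustration of what a slerp merge does, the sketch below interpolates between two weight tensors along the arc connecting them rather than along a straight line, which tends to preserve the geometry of each parent's weights better than plain averaging. This is a minimal standalone sketch of the slerp formula, not the actual merge configuration used for this model; the tensor names and the interpolation factor `t=0.5` are assumptions for illustration.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t moves along the
    arc between the two (flattened) tensors.
    """
    v0_flat = v0.ravel().astype(np.float64)
    v1_flat = v1.ravel().astype(np.float64)
    # Cosine of the angle between the normalized, flattened tensors.
    dot = np.clip(
        np.dot(v0_flat / np.linalg.norm(v0_flat),
               v1_flat / np.linalg.norm(v1_flat)),
        -1.0, 1.0,
    )
    omega = np.arccos(dot)
    # Near-parallel tensors: fall back to plain linear interpolation.
    if np.sin(omega) < eps:
        return (1.0 - t) * v0 + t * v1
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return (s0 * v0_flat + s1 * v1_flat).reshape(v0.shape).astype(v0.dtype)

# Hypothetical usage: blend corresponding tensors from the two parents.
w_ultramerge = np.array([1.0, 0.0])   # stand-in for an UltraMerge-7B tensor
w_catunamayo = np.array([0.0, 1.0])   # stand-in for a CatunaMayo tensor
w_merged = slerp(0.5, w_ultramerge, w_catunamayo)
```

In practice a merge tool applies this per-tensor across the full checkpoints, often with a different interpolation factor per layer.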
