yleo/OgnoMonarch-7B
Text generation | Concurrency cost: 1 | Model size: 7B | Quant: FP8 | Context length: 4k | Published: Feb 14, 2024 | License: cc-by-nc-4.0 | Architecture: Transformer
OgnoMonarch-7B is a 7-billion-parameter language model developed by yleo, created by merging paulml/OGNO-7B and mlabonne/Monarch-7B with LazyMergekit. The merge uses slerp (spherical linear interpolation) to blend the weights of the two constituent models, aiming for a balanced performance profile that draws on the strengths of both. The model targets general text generation and serves as a versatile base for a range of natural language processing applications. It supports a context length of 4096 tokens, making it suitable for moderately long inputs.
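To illustrate the slerp operation at the heart of this kind of merge, here is a minimal NumPy sketch of spherical linear interpolation between two weight tensors. This is an illustrative simplification, not mergekit's actual implementation (which applies per-layer interpolation schedules); the function name and parameters are chosen for this example.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between weight vectors v0 and v1.

    t=0 returns v0, t=1 returns v1; intermediate t values follow the
    great-circle arc between the two directions rather than a straight
    line, which better preserves the norm of the merged weights.
    """
    # Normalize to find the angle between the two weight directions.
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    omega = np.arccos(dot)
    if omega < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return (1 - t) * v0 + t * v1
    s0 = np.sin((1 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1
```

In an actual merge, this interpolation would be applied tensor-by-tensor across the two checkpoints, with the interpolation factor `t` typically varied by layer type (e.g. attention vs. MLP weights) in the mergekit configuration.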