paulml/OmniBeagleSquaredMBX-v3-7B-v2
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4K · Published: Feb 9, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights
OmniBeagleSquaredMBX-v3-7B-v2 by paulml is a 7-billion-parameter language model created by merging paulml/OmniBeagleMBX-v3-7B and flemmingmiguel/MBX-7B-v3 with LazyMergekit. The merge uses the slerp method with layer-wise interpolation weights for the self_attn and mlp tensors, aiming to combine the strengths of the two parent models. It is intended for general text generation and offers a 4096-token context window.
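The card ships without a usage snippet, so the following is a minimal sketch of running the model through the Hugging Face transformers text-generation pipeline; the prompt and the sampling parameters (temperature, top_p, max_new_tokens) are illustrative assumptions, not values published by the author.

```python
# Minimal sketch: load paulml/OmniBeagleSquaredMBX-v3-7B-v2 for text generation.
# Sampling parameters are illustrative assumptions, not author-recommended values.
import torch
from transformers import AutoTokenizer, pipeline

model_id = "paulml/OmniBeagleSquaredMBX-v3-7B-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
generator = pipeline(
    "text-generation",
    model=model_id,
    tokenizer=tokenizer,
    torch_dtype=torch.float16,  # half precision; fits a 7B model on a single 24 GB GPU
    device_map="auto",
)

# Assumes the merged tokenizer carries a chat template from its Mistral-style parents.
messages = [{"role": "user", "content": "What is a model merge?"}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

outputs = generator(
    prompt,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_p=0.95,
)
print(outputs[0]["generated_text"])
```

As for the merge method itself: slerp (spherical linear interpolation) blends the parent checkpoints along the arc between their weight vectors rather than averaging them linearly, which preserves the norm of the interpolated weights, and the per-layer weighting mentioned above lets the merge favor one parent for attention tensors and the other for MLP tensors.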