flemmingmiguel/MBX-7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 21, 2024 · License: apache-2.0 · Architecture: Transformer

MBX-7B by flemmingmiguel is a 7-billion-parameter language model created by merging leveldevai/MarcDareBeagle-7B and leveldevai/MarcBeagle-7B with LazyMergekit. The merge uses the slerp (spherical linear interpolation) method across all 32 layers, with separate interpolation weights for the self_attn and mlp components. The model is intended for general text generation and offers a 4096-token context window.
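The card does not reproduce the merge configuration, but a LazyMergekit slerp merge of this shape is typically described by a YAML file like the sketch below. The interpolation values (`t`) and the choice of base model are hypothetical placeholders, not the actual settings used for MBX-7B; only the model names, the 32-layer range, the slerp method, and the per-component `self_attn`/`mlp` weighting are stated on the card.

```yaml
# Hypothetical mergekit slerp config in the style LazyMergekit generates.
# The t curves and base_model below are illustrative, not the real values.
slices:
  - sources:
      - model: leveldevai/MarcDareBeagle-7B
        layer_range: [0, 32]
      - model: leveldevai/MarcBeagle-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: leveldevai/MarcBeagle-7B   # assumption: either parent could be the base
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]     # placeholder per-layer interpolation curve
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]     # placeholder curve for the MLP blocks
    - value: 0.5                        # default weight for remaining tensors
dtype: bfloat16
```

In this scheme, `t` controls how far each tensor is interpolated from the base model (t=0) toward the other parent (t=1) along the spherical path, and `filter` restricts a weighting curve to tensors whose names match the given substring.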
