flemmingmiguel/MDBX-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 21, 2024 · License: apache-2.0 · Architecture: Transformer · Tags: Open Weights, Cold
MDBX-7B is a 7-billion-parameter language model created by flemmingmiguel by merging leveldevai/MarcDareBeagle-7B and leveldevai/MarcBeagle-7B with LazyMergekit. The merge uses the slerp method across all 32 layers, with distinct parameter weighting for the self_attn and mlp components. The model targets general language generation tasks and offers a 4096-token context length.
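
A minimal usage sketch, assuming the model is published on the Hugging Face Hub under the repo id above and loads through the standard transformers AutoTokenizer / AutoModelForCausalLM APIs; the prompt text and generation settings are illustrative only:

```python
# Assumes: transformers and accelerate installed, and the checkpoint
# "flemmingmiguel/MDBX-7B" is available on the Hugging Face Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "flemmingmiguel/MDBX-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain what a slerp model merge is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# The model card lists a 4096-token context, so keep the prompt plus
# generated tokens within that window.
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```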