paulml/OmniBeagleMBX-v3-7B
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Feb 4, 2024
License: cc-by-nc-4.0
Architecture: Transformer
OmniBeagleMBX-v3-7B is a 7-billion-parameter language model created by paulml, produced by a slerp merge of mlabonne/OmniBeagle-7B and flemmingmiguel/MBX-7B-v3. The merge combines the strengths of its constituent models, offering a balanced performance profile for general text generation within a 4096-token context window. It is aimed at developers who want a merged model whose self-attention and MLP layers are interpolated with distinct weighting curves.
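A slerp merge of this kind is commonly expressed as a mergekit configuration. The sketch below shows the general shape of such a recipe; the layer ranges, the choice of base model, and the per-filter interpolation weights (`t`) are illustrative assumptions, not the actual values used to produce OmniBeagleMBX-v3-7B:

```yaml
# Hypothetical mergekit slerp config in the style used for 7B merges.
# Layer ranges and t-curves are placeholders, not the published recipe.
slices:
  - sources:
      - model: mlabonne/OmniBeagle-7B
        layer_range: [0, 32]
      - model: flemmingmiguel/MBX-7B-v3
        layer_range: [0, 32]
merge_method: slerp
base_model: mlabonne/OmniBeagle-7B   # assumed; either parent could serve
parameters:
  t:
    # Separate interpolation curves for attention and MLP layers,
    # matching the per-layer-type weighting described above.
    - filter: self_attn
      value: [0.0, 0.5, 0.3, 0.7, 1.0]
    - filter: mlp
      value: [1.0, 0.5, 0.7, 0.3, 0.0]
    - value: 0.5   # fallback weight for all remaining tensors
dtype: bfloat16
```

Under slerp, `t` controls spherical interpolation between the two parents per tensor (0 = first model, 1 = second), so the opposing curves above would bias attention layers toward one parent and MLP layers toward the other at different depths.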