paulml/NeuralOmniBeagleMBX-v3-7B
TEXT GENERATION
Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Concurrency Cost: 1
Published: Feb 5, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer
NeuralOmniBeagleMBX-v3-7B by paulml is a 7-billion-parameter merged language model combining mlabonne/NeuralOmniBeagle-7B and flemmingmiguel/MBX-7B-v3. The merge uses the SLERP (spherical linear interpolation) method, with separate interpolation weights applied to the self-attention and MLP layers, yielding a distinct blend of the constituent models' capabilities. It is intended for general text generation tasks, with the merged weights offering potentially better performance than either component model on its own.
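To illustrate what a SLERP merge does at the level of individual weight tensors, here is a minimal sketch in NumPy. This is not the actual merge code used to build this model (tools such as mergekit operate per-layer with configurable interpolation schedules); it only shows the underlying spherical interpolation, where `t` is the blend factor between the two parent models.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values interpolate
    along the arc between the two vectors' directions.
    """
    # Angle between the (normalized) vectors
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    omega = np.arccos(dot)
    if omega < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    so = np.sin(omega)
    return (np.sin((1 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# Example: blending two orthogonal unit vectors at t=0.5
v0 = np.array([1.0, 0.0])
v1 = np.array([0.0, 1.0])
mid = slerp(0.5, v0, v1)  # lies on the unit circle between v0 and v1
```

In a real merge, a function like this is applied tensor-by-tensor, and the description above implies different `t` schedules for self-attention versus MLP weights.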