paulml/NMTOB-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k
Published: Feb 12, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights

paulml/NMTOB-7B is a 7-billion-parameter language model created by paulml, produced by a SLERP (spherical linear interpolation) merge of Kukedlc/NeuTrixOmniBe-7B-model-remix and paulml/OmniBeagleSquaredMBX-v3-7B-v2. The merge combines the strengths of its constituent models, applying separate interpolation weightings to the self-attention and MLP layers. It is designed for general text-generation tasks and supports a 4096-token context length.
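To illustrate what a SLERP merge does, the sketch below interpolates two flattened weight vectors along the great-circle arc between them rather than along a straight line. This is a simplified, self-contained illustration of the interpolation formula itself, not the actual merge recipe used for this model; the exact per-layer weightings are not published here.

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values move along the
    great-circle arc between the two (normalized) directions. Merge tools
    apply this per tensor; this sketch operates on plain Python lists.
    """
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))      # clamp against rounding error
    omega = math.acos(dot)              # angle between the two vectors
    if abs(math.sin(omega)) < eps:      # nearly parallel: plain lerp
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

In practice a merge config would use different `t` values for the self-attention and MLP layer groups, which matches the card's mention of distinct weightings for those layers.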
