paulml/DPOB-NMTOB-7B
Text Generation
Concurrency cost: 1 | Model size: 7B | Quantization: FP8 | Context length: 4k
Published: Feb 12, 2024 | License: cc-by-nc-4.0 | Architecture: Transformer
Tags: Open Weights, Cold

paulml/DPOB-NMTOB-7B is a 7-billion-parameter language model created by paulml, produced by merging eren23/dpo-binarized-NeutrixOmnibe-7B with paulml/OmniBeagleSquaredMBX-v3-7B-v2. The merge uses the slerp (spherical linear interpolation) method across all 32 transformer layers, with distinct interpolation weights for the self_attn and mlp components. The model is intended for general text generation and offers a 4096-token (4k) context window.
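The model card does not include the merge configuration itself, but the mechanics of a slerp merge are straightforward to sketch. Below is a minimal, hypothetical Python illustration of spherical linear interpolation applied per weight tensor, with separate interpolation factors for self_attn and mlp parameters. The factor values and helper names (`slerp`, `merge_state_dicts`, `t_attn`, `t_mlp`) are assumptions for illustration, not the actual recipe; in practice, tools such as mergekit express this as a declarative config.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    Treats each tensor as a flat vector and interpolates along the great
    circle between them; falls back to linear interpolation when the two
    vectors are nearly parallel.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    dot = torch.clamp(a_unit @ b_unit, -1.0, 1.0)
    omega = torch.acos(dot)          # angle between the two weight vectors
    if omega.abs() < 1e-4:           # nearly parallel: slerp degenerates to lerp
        merged = (1 - t) * a_flat + t * b_flat
    else:
        so = torch.sin(omega)
        merged = (torch.sin((1 - t) * omega) / so) * a_flat \
               + (torch.sin(t * omega) / so) * b_flat
    return merged.reshape(a.shape).to(a.dtype)

def merge_state_dicts(sd_a, sd_b, t_attn=0.5, t_mlp=0.5, t_default=0.5):
    """Merge two state dicts, giving self_attn and mlp tensors their own
    interpolation factors (hypothetical values, not the model's recipe)."""
    merged = {}
    for name, w_a in sd_a.items():
        w_b = sd_b[name]
        if "self_attn" in name:
            t = t_attn
        elif "mlp" in name:
            t = t_mlp
        else:
            t = t_default
        merged[name] = slerp(t, w_a, w_b)
    return merged
```

Compared with plain linear averaging, slerp preserves the magnitude and angular relationship of the two parent weight vectors, which is why it is a common choice when blending two fine-tunes of the same base architecture.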

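Since the model is a standard 7B transformer with open weights, it can presumably be loaded through the Hugging Face transformers library in the usual way. This is a generic usage sketch, not an official example from the model card; the prompt and generation settings are placeholders.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "paulml/DPOB-NMTOB-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # place layers on available GPU(s)/CPU
    torch_dtype="auto",  # use the checkpoint's native precision
)

prompt = "Explain spherical linear interpolation in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that generation requests should stay within the model's 4096-token context window, counting both the prompt and the newly generated tokens.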