paulml/DPOB-INMTOB-7B
Task: Text generation
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Feb 12, 2024
License: cc-by-nc-4.0
Architecture: Transformer
Concurrency cost: 1
Weights: Open

paulml/DPOB-INMTOB-7B is a 7-billion-parameter language model created by paulml by merging liminerity/Omningotex-7b-slerp and eren23/merged-dpo-binarized-NeutrixOmnibe-7B using the slerp merge method. It supports a 4096-token context length and demonstrates strong general reasoning capabilities, achieving an average score of 76.21 across the Open LLM Leaderboard benchmarks.
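Slerp (spherical linear interpolation) merging blends two models' weight tensors along the arc between them rather than along a straight line, which preserves the norm of the parameters better than plain averaging. A minimal sketch of the underlying interpolation on a single flattened weight tensor (illustrative only; this is not the exact implementation used to produce this merge, and real merge tooling applies it per-tensor with configurable interpolation factors):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the
    great-circle arc between the two directions.
    """
    v0 = np.asarray(v0, dtype=np.float64)
    v1 = np.asarray(v1, dtype=np.float64)
    # Angle between the two vectors, from their unit directions.
    dot = np.dot(v0 / np.linalg.norm(v0), v1 / np.linalg.norm(v1))
    omega = np.arccos(np.clip(dot, -1.0, 1.0))
    so = np.sin(omega)
    if so < eps:
        # Nearly parallel vectors: fall back to linear interpolation.
        return (1.0 - t) * v0 + t * v1
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# Merging two unit-norm directions at t=0.5 yields the normalized
# mid-direction, unlike plain averaging which shrinks the norm.
merged = slerp(0.5, np.array([1.0, 0.0]), np.array([0.0, 1.0]))
```

At `t=0.5` with orthogonal unit vectors, the result keeps unit norm, whereas a straight average would have norm 1/√2; this norm preservation is one motivation for slerp over linear weight averaging.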
