automerger/OgnoExperiment27-7B
Task: Text Generation
Model Size: 7B
Quantization: FP8
Context Length: 4k
Concurrency Cost: 1
Published: Mar 8, 2024
License: apache-2.0
Architecture: Transformer

automerger/OgnoExperiment27-7B is a 7-billion-parameter language model created by automerger using the SLERP merge method. It merges eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v2 with yam-peleg/Experiment27-7B, combining the strengths of both parent models into a single checkpoint intended for general-purpose text generation.
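SLERP (spherical linear interpolation) blends two weight tensors by moving along the arc between them rather than the straight line, which preserves the magnitude characteristics of the parents better than plain averaging. The sketch below illustrates the core operation on toy NumPy vectors; it is a simplified illustration of the technique, not the exact implementation used to produce this merge, and the function name and `t` parameter are illustrative.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two tensors with factor t in [0, 1]."""
    # Work with unit vectors to measure the angle between the tensors.
    v0_n = v0 / np.linalg.norm(v0)
    v1_n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    # Nearly parallel tensors: fall back to plain linear interpolation.
    if abs(dot) > 1.0 - eps:
        return (1.0 - t) * v0 + t * v1
    omega = np.arccos(dot)          # angle between the two tensors
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1

# At t=0 the result is the first parent, at t=1 the second;
# intermediate t values trace the arc between them.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # → [0.7071..., 0.7071...]
```

In a full model merge this interpolation is applied tensor-by-tensor across the two checkpoints, typically with per-layer interpolation factors.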
