AurelPx/Pegasus-7b-slerp
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 22, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
AurelPx/Pegasus-7b-slerp is a 7-billion-parameter language model created by AurelPx, formed by merging ammarali32/multi_verse_model and eren23/dpo-binarized-NeutrixOmnibe-7B using the SLERP (spherical linear interpolation) merge method. By interpolating the weights of its constituent models rather than training from scratch, the merge aims to combine their strengths into a versatile base for a range of natural language processing tasks. The model is designed for general-purpose text generation and understanding within a 4096-token context window.
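To illustrate the merge method, the sketch below shows spherical linear interpolation between two flattened weight tensors. This is a minimal illustration, not the actual merging code used for this model (tools such as mergekit apply SLERP per parameter tensor with configurable interpolation factors); the function name and the NumPy-based formulation are assumptions for demonstration.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between weight vectors v0 and v1.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two directions rather than the straight line.
    """
    # Normalize copies to measure the angle between the two vectors.
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return (1 - t) * v0 + t * v1
    sin_theta = np.sin(theta)
    return (np.sin((1 - t) * theta) / sin_theta) * v0 \
         + (np.sin(t * theta) / sin_theta) * v1
```

In a real model merge this interpolation would be applied to each pair of corresponding parameter tensors from the two source checkpoints.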