leveldevai/TurdusDareBeagle-7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 18, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

leveldevai/TurdusDareBeagle-7B is a 7-billion-parameter language model created by merging udkai/Turdus and shadowml/DareBeagle-7B with LazyMergekit. The merge uses the slerp (spherical linear interpolation) method to combine the strengths of its base models, aiming for a balanced performance profile. It is designed for general-purpose text generation and supports a 4096-token context window.
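To illustrate the idea behind a slerp merge: instead of averaging two weight tensors linearly, slerp interpolates along the arc between them, preserving vector magnitude more faithfully. Below is a minimal, illustrative pure-Python sketch of slerp on flat weight vectors; it is not the actual LazyMergekit implementation, and the function name and epsilon handling are assumptions for demonstration.

```python
import math

def slerp(v0, v1, t, eps=1e-8):
    """Illustrative spherical linear interpolation between two weight
    vectors. t=0 returns v0, t=1 returns v1. Not LazyMergekit's code."""
    dot = sum(a * b for a, b in zip(v0, v1))
    n0 = math.sqrt(sum(a * a for a in v0))
    n1 = math.sqrt(sum(b * b for b in v1))
    # Clamp to guard against floating-point drift outside [-1, 1]
    cos_theta = max(-1.0, min(1.0, dot / (n0 * n1)))
    theta = math.acos(cos_theta)
    if theta < eps:
        # Nearly parallel vectors: fall back to linear interpolation
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s = math.sin(theta)
    w0 = math.sin((1 - t) * theta) / s
    w1 = math.sin(t * theta) / s
    return [w0 * a + w1 * b for a, b in zip(v0, v1)]

# Midpoint between two orthogonal unit vectors stays on the unit circle
mid = slerp([1.0, 0.0], [0.0, 1.0], 0.5)
```

In a real merge this interpolation is applied per tensor (often with a per-layer `t` schedule), which is what distinguishes slerp merges from plain weight averaging.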
