Azazelle/Yuna-7b-Merge
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 5, 2024 · License: cc-by-4.0 · Architecture: Transformer · Open weights

Azazelle/Yuna-7b-Merge is an experimental 7-billion-parameter language model created by Azazelle. It was built with a DARE merge of several 7B models, including Dans-07YahooAnswers-7b, Maylin-7b, smol_bruin-7b, and Kunoichi-7B, with the aim of combining the strengths of its constituent models for general language tasks. Its 4096-token context length supports moderately long input sequences.
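DARE merges of this kind are commonly produced with the mergekit toolkit. A hypothetical configuration sketch follows; the densities, weights, choice of base model, and dtype are illustrative assumptions (the card does not publish them), and the short model names stand in for their full Hugging Face repository paths:

```yaml
# Hypothetical mergekit config for a DARE-TIES merge of the listed 7B models.
# All parameter values and the base model are assumptions, not the author's settings.
merge_method: dare_ties
base_model: Kunoichi-7B        # assumed base model; not stated in the card
models:
  - model: Dans-07YahooAnswers-7b
    parameters:
      density: 0.5             # fraction of delta weights kept (illustrative)
      weight: 0.25             # contribution to the merged model (illustrative)
  - model: Maylin-7b
    parameters:
      density: 0.5
      weight: 0.25
  - model: smol_bruin-7b
    parameters:
      density: 0.5
      weight: 0.25
dtype: float16
```

With mergekit installed, a config like this is typically run with `mergekit-yaml config.yml ./merged-model`, which writes the merged weights to the output directory.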
