Azazelle/Sina-Odin-7b-Merge
Text Generation
- Concurrency Cost: 1
- Model Size: 7B
- Quant: FP8
- Ctx Length: 4k
- Published: Jan 11, 2024
- License: cc-by-4.0
- Architecture: Transformer
- Open Weights

Azazelle/Sina-Odin-7b-Merge is an experimental 7-billion-parameter language model created by Azazelle, produced with a DARE merge of several base models, including Mihaiii/Metis-0.3. The model targets general language tasks and scores an average of 47.82 on the Open LLM Leaderboard benchmarks. It suits applications that need a compact yet capable model for reasoning and common-sense understanding.
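DARE merges of this kind are typically built with a tool such as mergekit. The exact recipe for this model is not published, so the following is only a hypothetical sketch of what a DARE-style merge configuration could look like; the density and weight values, and the second source model, are illustrative placeholders:

```yaml
# Hypothetical mergekit config sketching a DARE merge.
# Only Mihaiii/Metis-0.3 is confirmed as a source model;
# the base model, second source, and parameters are assumptions.
models:
  - model: Mihaiii/Metis-0.3
    parameters:
      density: 0.5   # fraction of delta weights kept after random drop
      weight: 0.5    # contribution of this model to the merge
  - model: some-org/another-7b-model   # placeholder second source
    parameters:
      density: 0.5
      weight: 0.5
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1  # assumed common 7B base
dtype: bfloat16
```

In a DARE merge, each source model's delta from the base is randomly sparsified (controlled by `density`), rescaled, and then combined, which lets several fine-tunes be folded into one checkpoint without retraining.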
