Azazelle/xDAN-SlimOrca
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Dec 29, 2023 · License: cc-by-4.0 · Architecture: Transformer · Open weights

Azazelle/xDAN-SlimOrca is a 7-billion-parameter language model created by Azazelle. It is a slerp (spherical linear interpolation) merge of xDAN-L1-Chat-RL-v1 and mistral-7b-slimorcaboros, both built on the Mistral-7B-v0.1 architecture. The merge is intended for general conversational tasks, balancing the strengths of its two parent models. It supports a 4096-token context length and scores an average of 68.04 on the Open LLM Leaderboard.
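To illustrate what a slerp merge does, here is a minimal sketch of spherical linear interpolation applied to a pair of weight tensors. This is not Azazelle's actual merge script (the real merge operates layer by layer over full model checkpoints, typically via a merging toolkit); the function name and the fallback threshold are illustrative choices.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherically interpolate between two weight tensors.

    t=0 returns v0 (first parent model's weights), t=1 returns v1.
    Unlike linear interpolation, slerp follows the arc between the two
    tensors, preserving their norm when the parents have equal norms.
    """
    v0f, v1f = v0.ravel(), v1.ravel()
    # Angle between the flattened tensors.
    cos_omega = np.dot(v0f, v1f) / (np.linalg.norm(v0f) * np.linalg.norm(v1f))
    cos_omega = np.clip(cos_omega, -1.0, 1.0)
    omega = np.arccos(cos_omega)
    if omega < eps:
        # Nearly colinear tensors: plain linear interpolation is stable here.
        return (1.0 - t) * v0 + t * v1
    sin_omega = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / sin_omega) * v0 \
         + (np.sin(t * omega) / sin_omega) * v1
```

Applied per layer with an interpolation weight `t`, this blends the two parent checkpoints into a single set of weights of the same shape.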
