seyf1elislam/KuTrix-7b
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 14, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Concurrency cost: 1 · Open weights

KuTrix-7b is a 7 billion parameter language model developed by seyf1elislam, created by merging mistralai/Mistral-7B-v0.1 with SanjiWatsuki/Kunoichi-DPO-v2-7B and CultriX/NeuralTrix-7B-dpo using the DARE TIES method. This model achieves an average score of 74.42 on the Open LLM Leaderboard, demonstrating strong performance across various reasoning and language understanding tasks. With a 4096-token context length, it is suitable for general-purpose text generation and conversational AI applications.
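A DARE TIES merge like the one described above is typically produced with mergekit. The following configuration is a minimal sketch of how such a merge could be declared; the `weight` and `density` values are illustrative assumptions, not the author's actual parameters.

```yaml
# Hypothetical mergekit config for a DARE TIES merge of the models
# named in this card. Weight/density values are assumed for
# illustration only.
models:
  - model: mistralai/Mistral-7B-v0.1
    # Base model: contributes the reference weights; no merge
    # parameters needed here.
  - model: SanjiWatsuki/Kunoichi-DPO-v2-7B
    parameters:
      weight: 0.5    # assumed relative contribution
      density: 0.6   # assumed fraction of delta weights kept
  - model: CultriX/NeuralTrix-7B-dpo
    parameters:
      weight: 0.5
      density: 0.6
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

With a config like this saved as `merge.yaml`, the merge would be run with `mergekit-yaml merge.yaml ./output-dir`. DARE TIES randomly drops a fraction of each fine-tuned model's delta weights (controlled by `density`), rescales the rest, and resolves sign conflicts TIES-style before adding them to the base model.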
