aqweteddy/mistral_tv-neural-marconroni
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Dec 29, 2023 · License: MIT · Architecture: Transformer · Open weights

aqweteddy/mistral_tv-neural-marconroni is a 7-billion-parameter language model based on Mistral 7B, enhanced with a "chat vector" approach for improved conversational ability in non-English languages such as Traditional Chinese, Simplified Chinese, and Korean. The chat-vector method aligns LLMs with human preferences across languages without per-language RLHF: the weight delta between a chat-tuned model and its base is reused on another base model, with a focus on instruction following and multi-turn dialogue. The model scores an average of 71.27 on the Open LLM Leaderboard, with strong results on reasoning and common-sense tasks.
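A minimal sketch of the chat-vector idea, as it is usually described in the task-vector literature (this card does not document the repo's exact recipe, so the mechanics below are an assumption): subtract the base model's weights from a chat-tuned model's weights to get the "chat vector", then add that vector to another base model, e.g. one continually pretrained on the target language.

```python
def apply_chat_vector(target_base, base, chat):
    """Merge a chat vector into a target model.

    All arguments are toy state dicts mapping parameter names to
    lists of floats (real models would use tensors per layer).
    """
    merged = {}
    for name, target_w in target_base.items():
        # The chat vector is what instruction tuning "added" to the base.
        merged[name] = [
            t + (c - b)
            for t, c, b in zip(target_w, chat[name], base[name])
        ]
    return merged


# Toy one-parameter "models" standing in for real checkpoints.
base = {"w": [1.0, 1.0]}      # original base model
chat = {"w": [1.5, 0.5]}      # base + instruction tuning
target = {"w": [2.0, 2.0]}    # base continually pretrained on a new language

merged = apply_chat_vector(target, base, chat)
# merged["w"] = target + (chat - base) = [2.5, 1.5]
```

The same arithmetic applied per-tensor over a full Mistral 7B state dict yields a chat-capable model in the new language without any additional fine-tuning.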
