aqweteddy/xwin-7b_chatvec-tulu2
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4K · Published: Nov 25, 2023 · License: Llama 2 · Architecture: Transformer · Open weights

aqweteddy/xwin-7b_chatvec-tulu2 is a 7-billion-parameter language model built from the Xwin-7B and Tulu-2-PPO models using a "chat vector" approach, an efficient way to align a model with human preferences by transferring chat behavior learned in one fine-tune to another. The model is designed to enhance chat capabilities across languages, including Traditional Chinese, Simplified Chinese, and Korean, by combining knowledge and behaviors already present in the constituent LLMs. It focuses on instruction following, multi-turn dialogue, and reduced toxicity, making it suitable for multilingual conversational applications.
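The chat-vector idea can be sketched as simple weight arithmetic: subtract a base model's weights from a chat-tuned variant to obtain a "chat vector", then add that vector to another model fine-tuned from the same base. The sketch below is a minimal illustration only; real usage operates on the state dicts of identically-shaped model tensors, and the scalar values here are hypothetical.

```python
# Minimal sketch of chat-vector arithmetic on toy weights.
# Real models would use tensors (e.g. PyTorch state_dicts), not floats.

def chat_vector(chat_weights, base_weights):
    # Delta capturing the chat alignment learned on top of the base model.
    return {k: chat_weights[k] - base_weights[k] for k in base_weights}

def apply_chat_vector(target_weights, vector, scale=1.0):
    # Add the chat vector to another model derived from the same base;
    # `scale` controls how strongly the chat behavior is transferred.
    return {k: target_weights[k] + scale * vector[k] for k in target_weights}

# Hypothetical one-parameter "models" sharing a common base:
base = {"w": 1.0}
chat = {"w": 1.5}    # base + chat fine-tuning (e.g. a Tulu-2-style model)
target = {"w": 1.2}  # base + task fine-tuning (e.g. an Xwin-style model)

vec = chat_vector(chat, base)            # {"w": 0.5}
merged = apply_chat_vector(target, vec)  # {"w": 1.7}
```

The appeal of this approach is that no further training is needed: the alignment behavior encoded in the delta transfers to the target model through addition alone.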
