mncai/mistral-7b-dpo-v5
Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Dec 14, 2023
License: apache-2.0
Architecture: Transformer

mncai/mistral-7b-dpo-v5 is a 7 billion parameter language model developed by MindsAndCompany, based on the Mistral architecture. It has been instruction-tuned and further optimized using Direct Preference Optimization (DPO). This model is designed for general conversational tasks, supporting a context length of 4096 tokens.
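As a conversational sketch of how the model might be prompted, the helper below wraps a user message in the Mistral instruct chat format (`[INST] ... [/INST]`); note this template is an assumption based on the Mistral base model, since the card does not state mncai's exact prompt format, and the commented-out lines show a typical Hugging Face `transformers` loading path.

```python
def build_prompt(user_message: str) -> str:
    # Assumed Mistral-style instruct template; verify against the
    # model's tokenizer chat template before relying on it.
    return f"<s>[INST] {user_message} [/INST]"


# Typical loading path with Hugging Face transformers (not run here,
# as it downloads ~7B parameters of weights):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained("mncai/mistral-7b-dpo-v5")
# model = AutoModelForCausalLM.from_pretrained("mncai/mistral-7b-dpo-v5")

print(build_prompt("Summarize the plot of Hamlet."))
# → <s>[INST] Summarize the plot of Hamlet. [/INST]
```

Keeping the prompt within the 4096-token context window (including the model's response) is the caller's responsibility.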
