charlesdedampierre/TopicNeuralHermes-2.5-Mistral-7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 12, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

TopicNeuralHermes-2.5-Mistral-7B by charlesdedampierre is a 7 billion parameter Mistral-based language model fine-tuned with DPO on a dataset refined through a novel topic-modeling approach: distinctive topics are identified in the accepted answers and used to select training examples. The model aims for strong performance with significantly less training data, and it is optimized for conversational tasks, showing efficient convergence and competitive results against models trained on larger datasets.
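For conversational use, models in the NeuralHermes-2.5 lineage typically expect the ChatML prompt format. A minimal sketch of building such a prompt is shown below; `format_chatml` is a hypothetical helper, not part of the repository, and the ChatML assumption is inferred from the base model's lineage rather than stated in this listing.

```python
# Hypothetical helper: renders a chat transcript in the ChatML format
# commonly used by NeuralHermes-2.5-style models (an assumption here).
def format_chatml(messages):
    """Render a list of {"role", "content"} dicts as a ChatML prompt string."""
    prompt = ""
    for m in messages:
        prompt += f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
    # Open an assistant turn so the model generates the reply next.
    prompt += "<|im_start|>assistant\n"
    return prompt

prompt = format_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize DPO in one sentence."},
])
```

The resulting string can be passed to any text-generation endpoint serving the model; stopping on `<|im_end|>` keeps the reply to a single assistant turn.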
