dvilasuero/NeuralHermes-2.5-Mistral-7B-distilabel
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · Published: Jan 8, 2024 · License: MIT · Architecture: Transformer · Open Weights

dvilasuero/NeuralHermes-2.5-Mistral-7B-distilabel is a 7-billion-parameter language model based on the Mistral architecture, fine-tuned with the distilabel framework. It was trained on the argilla/distilabel-intel-orca-dpo-pairs dataset using Direct Preference Optimization (DPO) to improve response quality. The model is designed for tasks that require high-quality, instruction-following text generation, with its preference-tuned training aimed at stronger conversational ability.
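As a minimal sketch of how to prompt the model, the snippet below builds a chat prompt in the ChatML template. This format is an assumption based on the NeuralHermes-2.5 lineage; check the tokenizer's chat template (e.g. via `tokenizer.apply_chat_template` in transformers) before relying on it. The resulting string can be passed to any text-generation endpoint serving this model.

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Format a single system/user turn in the ChatML template.

    NOTE: ChatML is assumed here from the NeuralHermes-2.5 lineage;
    verify against the model's actual chat template before use.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"  # model continues from here
    )

prompt = build_chatml_prompt(
    "You are a helpful assistant.",
    "Summarize Direct Preference Optimization in one sentence.",
)
print(prompt)
```

The trailing `<|im_start|>assistant\n` leaves the prompt open so the model generates the assistant turn; stop generation on `<|im_end|>`.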
