mlabonne/NeuralHermes-2.5-Mistral-7B-laser
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Jan 4, 2024 | License: apache-2.0 | Architecture: Transformer

mlabonne/NeuralHermes-2.5-Mistral-7B-laser is a 7-billion-parameter Mistral-based causal language model, fine-tuned with Direct Preference Optimization (DPO) on the mlabonne/chatml_dpo_pairs dataset. This experimental version applies Layer-Selective Rank Reduction (LASER), a technique that replaces selected weight matrices with low-rank approximations to improve performance. It is designed for general conversational AI tasks and scores higher than its base model on benchmarks.
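For reference, a minimal usage sketch with the Hugging Face transformers library (not part of the original card): it loads the checkpoint and runs one chat turn. The ChatML chat template and FP16 settings are assumptions; the hosted FP8 quantization is deployment-specific.

```python
# Minimal sketch: load the model and run a single chat completion with the
# Hugging Face transformers library. Assumes the checkpoint ships a ChatML
# chat template (NeuralHermes is ChatML-tuned); adjust dtype/device as needed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mlabonne/NeuralHermes-2.5-Mistral-7B-laser"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # FP8 serving is host-specific; FP16 is a safe local default
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain DPO fine-tuning in one sentence."},
]

# apply_chat_template renders the messages with the model's chat template
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```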
