mlabonne/Darewin-7B-v2
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 24, 2024 · License: apache-2.0 · Architecture: Transformer

mlabonne/Darewin-7B-v2 is a 7-billion-parameter language model created by mlabonne by merging seven Mistral-based models with the dare_ties method. The merge aims to combine the strengths of its constituent models, yielding improved performance across general benchmarks, and the model supports a 4096-token context length for a broad range of natural language processing tasks.
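To give a sense of what the dare_ties method does, here is a toy NumPy sketch of the core idea: each fine-tuned model's "task vector" (its delta from the base model) is sparsified by randomly dropping entries and rescaling the survivors (DARE), then a per-parameter sign election keeps only deltas that agree with the majority direction (TIES). This is an illustrative simplification written for this card, not the actual mergekit implementation or configuration used for Darewin-7B-v2; the function names and parameters are mine.

```python
import numpy as np

def dare(delta, drop_prob, rng):
    """DARE: randomly Drop delta entries And REscale survivors by 1/(1 - p)."""
    mask = rng.random(delta.shape) >= drop_prob
    return delta * mask / (1.0 - drop_prob)

def dare_ties_merge(base, finetuned_models, drop_prob=0.5, seed=0):
    """Toy dare_ties merge: sparsify each task vector with DARE, elect a
    majority sign per parameter, and average only the agreeing deltas."""
    rng = np.random.default_rng(seed)
    deltas = np.stack([dare(m - base, drop_prob, rng) for m in finetuned_models])
    # Sign election: the sign of the summed deltas wins each coordinate.
    elected = np.sign(deltas.sum(axis=0))
    agree = (np.sign(deltas) == elected) & (deltas != 0)
    counts = np.maximum(agree.sum(axis=0), 1)  # avoid division by zero
    merged_delta = np.where(agree, deltas, 0.0).sum(axis=0) / counts
    return base + merged_delta

# Example: with no dropout, merging identical fine-tunes recovers them exactly.
base = np.zeros(4)
delta = np.array([1.0, -2.0, 3.0, 0.5])
merged = dare_ties_merge(base, [base + delta] * 3, drop_prob=0.0)
```

With a nonzero `drop_prob` (mergekit exposes this as a density parameter), the rescaling keeps the expected magnitude of each task vector intact while the sign election reduces destructive interference between the seven source models.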
