mlabonne/Daredevil-7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4K · Published: Jan 6, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open weights

Daredevil-7B is a 7-billion-parameter language model by mlabonne, created by merging three Mistral-7B-based models with LazyMergekit. It performs strongly across benchmarks such as AGIEval, GPT4All, and TruthfulQA, making it suitable for general-purpose reasoning and question-answering tasks. With a 4096-token context length, it offers a balanced option for applications that need robust language understanding and generation.
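As an illustration of the merge workflow, LazyMergekit generates a mergekit-style YAML recipe listing the source models and merge method. The sketch below builds such a config as a string; the source model names, merge method, and parameter values are placeholders for illustration only, not Daredevil-7B's actual recipe.

```python
from textwrap import dedent

# Hypothetical mergekit configuration in the style LazyMergekit produces.
# Model names, merge method, and parameters below are placeholders,
# NOT the actual recipe used to create Daredevil-7B.
MERGE_CONFIG = dedent("""\
    models:
      - model: mistralai/Mistral-7B-v0.1
        # base model: no parameters needed
      - model: example-org/model-one   # placeholder source model
        parameters:
          density: 0.5
          weight: 0.4
      - model: example-org/model-two   # placeholder source model
        parameters:
          density: 0.5
          weight: 0.3
    merge_method: dare_ties
    base_model: mistralai/Mistral-7B-v0.1
    dtype: bfloat16
    """)

print(MERGE_CONFIG)
```

In practice, a config like this is passed to the mergekit CLI, which downloads the listed checkpoints and writes the merged weights to an output directory.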
