mlabonne/Marcoro14-7B-slerp
Task: Text generation
Model size: 7B
Quantization: FP8
Context length: 4k
Concurrency cost: 1
Published: Dec 29, 2023
License: cc-by-nc-4.0
Architecture: Transformer
Open weights

mlabonne/Marcoro14-7B-slerp is a 7-billion-parameter language model created by merging AIDC-ai-business/Marcoroni-7B-v3 and EmbeddedLLM/Mistral-7B-Merge-14-v0.1 with the slerp (spherical linear interpolation) method. The merged model performs strongly across standard benchmarks and ranked highly among 7B models on the Open LLM Leaderboard at release. It is well suited to reasoning and general-knowledge tasks that require robust analytical capability.
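The core idea behind a slerp merge is to interpolate between two weight tensors along the arc of a hypersphere rather than along a straight line, which preserves the magnitude characteristics of the weights better than plain averaging. The sketch below is a minimal, illustrative NumPy implementation of spherical linear interpolation between two flattened weight vectors; the function name, the `eps` fallback threshold, and the per-tensor application are assumptions for illustration, not the exact code used to produce this model.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between vectors v0 and v1 at fraction t.

    Illustrative sketch: real merge tooling applies this per weight tensor,
    often with a different interpolation factor per layer group.
    """
    # Normalize copies only to measure the angle between the two vectors
    v0n = v0 / (np.linalg.norm(v0) + eps)
    v1n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    theta = np.arccos(dot)
    # Near-parallel vectors: fall back to linear interpolation to avoid 0/0
    if abs(theta) < eps:
        return (1.0 - t) * v0 + t * v1
    s = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1
```

At `t = 0` the result is the first model's weights, at `t = 1` the second's, and intermediate values trace the arc between them; a merge typically picks an interpolation schedule per layer rather than one global `t`.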
