mlabonne/Marcoro14-7B-slerp is a 7-billion-parameter language model merged from AIDC-ai-business/Marcoroni-7B-v3 and EmbeddedLLM/Mistral-7B-Merge-14-v0.1 using spherical linear interpolation (SLERP). The model performs strongly across a range of benchmarks, ranking highly among 7B models on the Open LLM Leaderboard. It excels at reasoning and general-knowledge tasks, making it a good fit for applications that need robust analytical capabilities.
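To illustrate the merge method: SLERP blends two weight tensors along the arc between them rather than along a straight line, which preserves the magnitude characteristics of the parent weights better than plain averaging. Below is a minimal, self-contained sketch of the interpolation formula on flat Python lists; the actual merge tooling (e.g. mergekit) applies this per tensor across the two models, with a configurable interpolation factor per layer.

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values follow
    the arc between the two vectors instead of the chord.
    """
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    # Angle between the two vectors, clamped for numerical safety.
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1 + eps)
    theta = math.acos(max(-1.0, min(1.0, dot)))
    if theta < eps:
        # Nearly parallel vectors: linear interpolation is stable here.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s = math.sin(theta)
    w0 = math.sin((1 - t) * theta) / s
    w1 = math.sin(t * theta) / s
    return [w0 * a + w1 * b for a, b in zip(v0, v1)]
```

For two orthogonal unit vectors, the midpoint (`t=0.5`) stays on the unit sphere, whereas a plain average would shrink its norm to about 0.707; this is the property that motivates SLERP over linear merging.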