Artples/L-MChat-7b
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Apr 2, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Artples/L-MChat-7b is a 7 billion parameter language model created by Artples by merging Nexusflow/Starling-LM-7B-beta and FuseAI/FuseChat-7B-VaRM. The merge uses the slerp method to combine the strengths of its base models, offering balanced performance across benchmarks. It is designed for general conversational AI tasks and posts competitive results on the Open LLM Leaderboard.
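The slerp (spherical linear interpolation) merge mentioned above interpolates between two models' weight tensors along the great circle connecting them rather than along a straight line, which better preserves weight magnitudes. Below is a minimal numpy sketch of the idea on toy tensors; it is illustrative only, not the actual merge implementation used for this model, and the flattening and near-parallel fallback are assumptions of this sketch.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    Flattens both tensors, interpolates along the great circle between
    them, and falls back to plain linear interpolation when they are
    nearly colinear (where the slerp formula is numerically unstable).
    """
    shape = v0.shape
    a = v0.ravel().astype(np.float64)
    b = v1.ravel().astype(np.float64)
    # Cosine of the angle between the two directions.
    dot = np.dot(a / np.linalg.norm(a), b / np.linalg.norm(b))
    omega = np.arccos(np.clip(dot, -1.0, 1.0))
    if np.abs(np.sin(omega)) < eps:
        # Nearly parallel: plain lerp is safe and equivalent here.
        out = (1.0 - t) * a + t * b
    else:
        out = (np.sin((1.0 - t) * omega) * a + np.sin(t * omega) * b) / np.sin(omega)
    return out.reshape(shape)

# Toy tensors standing in for one layer's weights from each base model.
w0 = np.array([[1.0, 0.0], [0.0, 1.0]])
w1 = np.array([[0.0, 1.0], [1.0, 0.0]])
merged = slerp(0.5, w0, w1)  # halfway between the two "models"
```

In a real merge this interpolation is applied per weight tensor, often with a different `t` per layer; at `t=0` it returns the first model's weights and at `t=1` the second's.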
