allknowingroger/QwenSlerp5-14B
Text generation | Concurrency cost: 1 | Model size: 14.8B | Quant: FP8 | Context length: 32k | Published: Nov 27, 2024 | Architecture: Transformer

allknowingroger/QwenSlerp5-14B is a 14.8 billion parameter language model created by allknowingroger via a SLERP (spherical linear interpolation) merge of CultriX/Qwestion-14B and CultriX/SeQwence-14Bv1. The merge uses a V-shaped interpolation curve, so different layer groups draw more heavily from one parent model or the other rather than blending all layers uniformly. The model targets general language tasks, with evaluation results available on the Open LLM Leaderboard.
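To make the merge method concrete, here is a minimal sketch of SLERP applied to weight tensors, with a V-shaped schedule for the interpolation factor `t`. The function `slerp` and the example `v_curve` values are illustrative assumptions, not the exact configuration used for this model:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0 (one parent), t=1 returns v1 (the other parent);
    intermediate t moves along the arc between them.
    """
    v0_f, v1_f = v0.ravel(), v1.ravel()
    # Cosine of the angle between the flattened tensors.
    cos_omega = np.dot(v0_f, v1_f) / (
        np.linalg.norm(v0_f) * np.linalg.norm(v1_f) + eps
    )
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))
    if omega < eps:
        # Nearly parallel tensors: fall back to linear interpolation.
        return (1 - t) * v0 + t * v1
    sin_omega = np.sin(omega)
    return (np.sin((1 - t) * omega) / sin_omega) * v0 + (
        np.sin(t * omega) / sin_omega
    ) * v1

# A hypothetical V-shaped schedule across layer groups: the early and
# late layers favor one parent (t near 1), the middle layers the other
# (t near 0), tracing a "V" when plotted against layer depth.
v_curve = [1.0, 0.5, 0.0, 0.5, 1.0]
```

In practice tools like mergekit apply such a per-layer `t` schedule to every matching tensor pair in the two checkpoints; the endpoints of the V determine which parent dominates the embedding-side and output-side layers.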
