hotmailuser/QwenSlerp2-14B
Text generation · Concurrency cost: 1 · Model size: 14.8B · Quant: FP8 · Context length: 32k · Published: Jan 5, 2025 · License: apache-2.0 · Architecture: Transformer

hotmailuser/QwenSlerp2-14B is a 14.8 billion parameter language model created by hotmailuser through a SLERP (spherical linear interpolation) merge of sometimesanotion/Lamarck-14B-v0.6 and bamec66557/Qwen-2.5-14B-MINUS. The merge aims to combine the strengths of its constituent models, and the 32,768-token context length makes it suitable for tasks requiring long-range language understanding and generation. Its merging configuration applies a V-shaped curve to the interpolation factor across layers, weighting each base model differently at the early, middle, and late layers.
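To make the merge recipe concrete, the sketch below shows how SLERP between two weight tensors works, plus a hypothetical V-shaped schedule for the interpolation factor across layers. This is an illustration of the general technique, not the model's actual merge config: the `t_min`/`t_max` values, layer count, and fallback behavior are assumptions, and real tooling such as mergekit handles many additional details.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flattened weight tensors."""
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    omega = np.arccos(dot)  # angle between the two weight directions
    if omega < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    so = np.sin(omega)
    return (np.sin((1 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

def v_shaped_t(layer: int, num_layers: int,
               t_min: float = 0.1, t_max: float = 0.9) -> float:
    """Hypothetical V-shaped schedule: t is highest at the first and last
    layers and dips to t_min at the middle layer, so one base model
    dominates the middle of the network and the other its ends."""
    half = (num_layers - 1) / 2
    x = abs(layer - half) / half  # 1.0 at the ends, 0.0 at the center
    return t_min + (t_max - t_min) * x
```

Per layer, each parameter tensor of the merged model would then be `slerp(v_shaped_t(i, n), w_a, w_b)` for the corresponding tensors `w_a`, `w_b` of the two base models.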
