CultriX/Qwen2.5-14B-HyperMarck-dl
Text Generation · Open Weights

- Model Size: 14.8B
- Quantization: FP8
- Context Length: 32k
- Concurrency Cost: 1
- Published: Feb 16, 2025
- License: apache-2.0
- Architecture: Transformer

CultriX/Qwen2.5-14B-HyperMarck-dl is a 14.8-billion-parameter language model created by CultriX, based on the Qwen2.5 architecture. It is a merge of pre-trained language models produced with the Linear DELLA merge method, using suayptalha/Lamarckvergence-14B as the base model and integrating CultriX/MergeStage1v3 and CultriX/MergeStage2v3, so that the merged model combines capabilities from its constituent models. With a 32,768-token (32k) context length, it is designed for general language understanding and generation tasks.
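To give an intuition for the merge described above, the sketch below illustrates a DELLA-style linear merge on toy NumPy tensors. This is a hypothetical simplification, not the actual mergekit implementation: the function name `della_linear_merge` and its parameters (`density`, `epsilon`) are assumptions for illustration. The core idea it demonstrates is DELLA's magnitude-aware pruning: each task vector (fine-tuned weights minus base weights) has its entries dropped stochastically, with larger-magnitude entries kept more often, survivors rescaled to preserve the expected delta, and the pruned deltas combined linearly before being added back to the base.

```python
import numpy as np

def della_linear_merge(base, deltas, weights, density=0.6, epsilon=0.2, seed=0):
    """Illustrative DELLA-style linear merge (simplified sketch, not mergekit's code).

    base:    base model parameter tensor
    deltas:  list of task vectors (finetuned - base), same shape as base
    weights: linear mixing weight per task vector
    density: average fraction of delta entries kept after pruning
    epsilon: spread of keep probabilities around `density` by magnitude rank
    """
    rng = np.random.default_rng(seed)
    merged = np.zeros_like(base, dtype=float)
    for delta, w in zip(deltas, weights):
        # Rank entries by magnitude: rank 0 = smallest |delta|, n-1 = largest.
        flat = np.abs(delta).ravel()
        ranks = np.argsort(np.argsort(flat))
        n = flat.size
        # Larger-magnitude entries get higher keep probability, spread across
        # [density - epsilon, density + epsilon] (the magnitude-based sampling step).
        keep_p = ((density - epsilon)
                  + 2.0 * epsilon * ranks / max(n - 1, 1)).reshape(delta.shape)
        mask = rng.random(delta.shape) < keep_p
        # Rescale survivors by 1/keep_p so the expected delta is preserved.
        pruned = np.where(mask, delta / keep_p, 0.0)
        merged += w * pruned
    return base + merged
```

With `density=1.0` and `epsilon=0.0` every entry is kept unscaled, and the merge reduces to a plain linear combination of task vectors added to the base.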
