PARTAGES-dev/Qwen3-1.7B-PDAPT-SLERP
Text Generation · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Dec 3, 2025 · Architecture: Transformer

PARTAGES-dev/Qwen3-1.7B-PDAPT-SLERP is a 1.7-billion-parameter language model built on the Qwen3-1.7B-Base architecture, created by SLERP-merging (spherical linear interpolation) Qwen/Qwen3-1.7B-Base with an undisclosed second model. The merge aims to combine the strengths of its constituent models, offering a potentially stronger foundation for a range of natural language processing tasks. With a 32,768-token context window, it suits applications that need to process longer inputs.
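As a rough illustration of what a SLERP merge does per weight tensor, here is a minimal NumPy sketch. This is not the exact recipe used for this model (the merge configuration and second parent are undisclosed); it simply shows spherical linear interpolation between two flattened parameter tensors, with a linear fallback when they are nearly parallel:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t: interpolation factor in [0, 1]; 0 returns v0, 1 returns v1.
    """
    # Flatten to vectors and compute the angle between them.
    a = v0.ravel() / np.linalg.norm(v0)
    b = v1.ravel() / np.linalg.norm(v1)
    dot = np.clip(np.dot(a, b), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    # Interpolate along the arc rather than the chord, which better
    # preserves the norm structure of the original weights.
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1
```

In a real merge, a function like this would be applied tensor-by-tensor across the two parent checkpoints, possibly with different `t` values for different layer groups.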
