isnainink90/qwen25-ppn-ppnbm-merged-model
Text generation · Model size: 7.6B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1 · Architecture: Transformer · Published: Mar 26, 2026

isnainink90/qwen25-ppn-ppnbm-merged-model is a 7.6-billion-parameter language model fine-tuned from Qwen/Qwen2.5-7B-Instruct. It targets general language tasks and inherits the Qwen2.5 base architecture. With a 32768-token context length, it is suited to applications that require understanding and generating over long inputs. Its main strength is its Qwen2.5 foundation, which carries over the base model's broad natural-language capabilities.
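When working with the 32768-token context window described above, callers typically need to budget prompt length against the number of tokens they plan to generate. The sketch below shows that bookkeeping in plain Python; the constant comes from the model card, while the function names are illustrative, not part of any official API, and real token counts would come from the model's own tokenizer.

```python
# Budgeting prompt tokens against the model's 32768-token context window.
# CTX_LEN is taken from the model card; everything else is an
# illustrative helper, not an official API.

CTX_LEN = 32768  # context length stated on the model card


def fits_in_context(prompt_tokens: int, max_new_tokens: int,
                    ctx_len: int = CTX_LEN) -> bool:
    """True if the prompt plus the requested generation budget fits."""
    return prompt_tokens + max_new_tokens <= ctx_len


def max_prompt_tokens(max_new_tokens: int, ctx_len: int = CTX_LEN) -> int:
    """Largest prompt length allowed for a given generation budget."""
    return max(0, ctx_len - max_new_tokens)
```

For example, with a 512-token generation budget the prompt may use at most 32256 tokens; anything longer must be truncated or summarized before being sent to the model.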