paudelnirajan/seqkd-Qwen2.5-7B-Instruct-Qwen2.5-0.5B-Instruct-npi-2766
Task: Text generation
Concurrency cost: 1
Model size: 0.5B parameters
Quantization: BF16
Context length: 32k
Published: Mar 30, 2026
Architecture: Transformer
paudelnirajan/seqkd-Qwen2.5-7B-Instruct-Qwen2.5-0.5B-Instruct-npi-2766 is a 0.5 billion parameter instruction-tuned language model based on the Qwen2.5 architecture, intended for general language understanding and generation tasks. The repository name suggests it was produced by sequence-level knowledge distillation (SeqKD) from a Qwen2.5-7B-Instruct teacher into a Qwen2.5-0.5B-Instruct student, though the model card does not confirm this. Its compact size makes it suitable for applications requiring efficient inference and deployment in resource-constrained environments. Further details on its training, capabilities, and differentiators are not provided in the available model card.
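Since the card lists no usage instructions, the following is a minimal sketch of how a model like this is typically loaded for chat-style generation, assuming it ships in the standard Hugging Face `transformers` format (the repo id is taken from the card; the `generate` helper and its parameters are illustrative, not part of the model card):

```python
# Minimal usage sketch for a Hugging Face-format causal LM.
# Assumes the `transformers` and `torch` packages are installed;
# the repo id comes from the model card above.
MODEL_ID = "paudelnirajan/seqkd-Qwen2.5-7B-Instruct-Qwen2.5-0.5B-Instruct-npi-2766"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Run one chat turn through the model and return the reply text."""
    # Imports are kept inside the function so the module can be
    # inspected without transformers/torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    # Qwen2.5-Instruct models expect the chat template for prompting.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens, keep only the generated continuation.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Explain knowledge distillation in one sentence."))
```

At 0.5B parameters the model fits comfortably on CPU or a small GPU, which is the main reason to prefer a distilled student over its 7B teacher for latency-sensitive deployments.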