CEIA-POSITIVO2/Qwen-1.7B-pt-capado
Text generation · Concurrency cost: 1 · Model size: 2B · Quantization: BF16 · Context length: 32k · Published: Feb 25, 2026 · Architecture: Transformer · Status: Warm

CEIA-POSITIVO2/Qwen-1.7B-pt-capado is a 2-billion-parameter language model developed by CEIA-POSITIVO2. It is based on the Qwen architecture and supports a context length of 32,768 tokens. Because its model card provides little information, the model's specific differentiators and primary use cases are not documented; it is presented as a general-purpose model for natural language processing tasks.
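When serving a model with a fixed 32,768-token context window, callers typically need to budget prompt length against the number of tokens they want generated. The sketch below illustrates that arithmetic; the helper names are hypothetical, and the whitespace split is a stand-in for the model's real tokenizer (in practice one would count tokens with the tokenizer shipped alongside the model, e.g. via Hugging Face's `AutoTokenizer`).

```python
# Minimal sketch of prompt budgeting against the 32,768-token context
# window stated on the model page. Helper names are hypothetical.

CTX_LENGTH = 32_768  # context length from the model card


def max_prompt_tokens(max_new_tokens: int, ctx_length: int = CTX_LENGTH) -> int:
    """Tokens left for the prompt after reserving generation headroom."""
    if not 0 < max_new_tokens < ctx_length:
        raise ValueError("max_new_tokens must be between 1 and ctx_length - 1")
    return ctx_length - max_new_tokens


def truncate_prompt(tokens: list[str], max_new_tokens: int) -> list[str]:
    """Keep only the most recent tokens that fit within the prompt budget."""
    budget = max_prompt_tokens(max_new_tokens)
    return tokens[-budget:]


# Example: reserving 1,024 tokens for output leaves 31,744 for the prompt.
print(max_prompt_tokens(1024))  # → 31744
```

The same bound applies regardless of how the model is loaded; exceeding it forces either prompt truncation or a shorter generation limit.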
