CEIA-POSITIVO2/Qwen-1.7B-pt-capado is a 1.7 billion parameter language model released by CEIA-POSITIVO2. Built on the Qwen architecture, it supports a context length of 32,768 tokens. Its model card provides little detail, so its specific differentiators and intended use cases are not documented; like other models of its size class, it is broadly applicable to general natural language processing tasks.