peremayolc/qwen-final-1-5
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Feb 4, 2026 · Architecture: Transformer

peremayolc/qwen-final-1-5 is a 1.5-billion-parameter language model based on the Qwen architecture, designed for general language understanding and generation tasks. Its large context window (131,072 tokens) makes it suitable for processing long documents and extended conversational histories, and it aims to provide a versatile foundation for a range of NLP applications.
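One practical consequence of a bounded context window is that long inputs must be checked (and possibly truncated) before they are sent to the model. The page lists both a 32k context length and a 131,072-token figure; the sketch below conservatively assumes the smaller 32k budget, and the tokens-per-word ratio is a rough heuristic, not an exact count — a real deployment would count tokens with the model's own tokenizer.

```python
# Rough pre-flight check that a prompt fits the model's context window.
# CTX_TOKENS and TOKENS_PER_WORD are illustrative assumptions; use the
# model's actual tokenizer for exact counts.
CTX_TOKENS = 32_000      # conservative 32k context budget (assumed)
TOKENS_PER_WORD = 1.3    # rough English words-to-tokens heuristic

def fits_context(text: str, reserve_for_output: int = 1_024) -> bool:
    """Estimate whether `text` plus a reply budget fits the context window."""
    est_tokens = int(len(text.split()) * TOKENS_PER_WORD)
    return est_tokens + reserve_for_output <= CTX_TOKENS

short_doc = "word " * 1_000    # ~1,300 estimated tokens: fits
long_doc = "word " * 40_000    # ~52,000 estimated tokens: does not fit
print(fits_context(short_doc))  # True
print(fits_context(long_doc))   # False
```

Reserving headroom for the generated reply (`reserve_for_output`) matters because the context window bounds the prompt and the completion together, not the prompt alone.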
