Chickaboo/ChickaQ
Task: Text generation
Concurrency cost: 1
Model size: 0.6B
Quantization: BF16
Context length: 32k
Published: Mar 9, 2024
License: MIT
Architecture: Transformer
Open weights

ChickaQ is a 0.6-billion-parameter language model created by merging Qwen/Qwen1.5-0.5B-Chat and vilm/Quyen-SE-v0.1 with the TIES method. It is aimed at general language tasks: its compact size allows efficient deployment while retaining the full 32,768-token context length, making it suitable for applications that need a small but still capable model.
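TIES merges of this kind are commonly produced with mergekit. The model card does not publish the actual merge recipe, so the configuration below is only an illustrative sketch: the density and weight values, and the choice of base model, are assumptions, not the settings used to build ChickaQ.

```yaml
# Hypothetical mergekit config for a TIES merge of the two source models.
# Densities, weights, and base_model are illustrative assumptions.
models:
  - model: Qwen/Qwen1.5-0.5B-Chat
    parameters:
      density: 0.5
      weight: 0.5
  - model: vilm/Quyen-SE-v0.1
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: Qwen/Qwen1.5-0.5B-Chat
dtype: bfloat16
```

With mergekit installed, a config like this would be run as `mergekit-yaml config.yml ./output-dir`; TIES trims low-magnitude deltas (per the `density` parameter) and resolves sign conflicts between the two models before averaging.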
