Qwen/Qwen2.5-14B is a 14.7 billion parameter causal language model from the Qwen team, with a 131,072-token context length. Part of the Qwen2.5 series, this base model significantly improves on Qwen2, with stronger knowledge, coding, and mathematics capabilities, better instruction following, and longer text generation, and it supports over 29 languages. As a pretrained base model, it is intended for further pretraining or fine-tuning rather than direct conversational use.
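Below is a minimal sketch of loading the model for plain text completion with the Hugging Face transformers library; the prompt and generation settings are illustrative, and `device_map="auto"` assumes the accelerate package is installed.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen2.5-14B"

# Load the tokenizer and base model (weights download on first use)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # spread layers across available devices (needs accelerate)
)

# A base model does raw text continuation, not chat, so pass a plain prompt
prompt = "The three primary colors are"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For chat-style use, fine-tune this checkpoint or use an instruction-tuned variant instead.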