Hercules-Qwen1.5-14B is a 14.2-billion-parameter language model developed by M4-ai, fine-tuned from Qwen1.5-14B on 700,000 examples from the Hercules-v4 dataset. It is optimized for a broad range of tasks, including math, coding, function calling, and roleplay. The model serves as a general-purpose assistant and supports a context length of 32,768 tokens.