AXCXEPT/Qwen3-EZO-8B-beta is an 8-billion-parameter language model developed by AXCXEPT and based on Qwen3-8B. Optimized for multi-turn tasks, it scores 9.08 on MT-Bench and 8.87 on JMT-Bench, performance comparable to larger models such as Gemini 2.5 Flash and GPT-4o. The model offers a 32K context length and supports parallel processing of deep-thinking prompts through its 'Deep-Think' technique, making it well suited to complex reasoning tasks.
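For reference, a minimal loading and inference sketch with the Hugging Face transformers library, assuming the checkpoint is published on the Hub under the id above and follows the standard Qwen3 chat template (including its enable_thinking switch; whether the EZO variant honors it identically is an assumption):

```python
# Minimal sketch: load the model and run one chat turn.
# Assumes the repo id "AXCXEPT/Qwen3-EZO-8B-beta" is available on the Hugging Face Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AXCXEPT/Qwen3-EZO-8B-beta"  # id taken from the description above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

messages = [{"role": "user", "content": "Explain the difference between BFS and DFS."}]
# Qwen3 chat templates expose an enable_thinking flag; assumed to apply here as well.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True, enable_thinking=True
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=1024)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```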