Lamapi/next-14b

14B parameters · FP8 · 32,768-token context length · MIT license

Overview

Lamapi/next-14b is a 14-billion parameter large language model (LLM) based on the Qwen 3 architecture, specifically designed for superior reasoning and analytical capabilities. It stands out as Türkiye’s first AI model focused on thinking, inferring, and decision-making rather than just responding. Unlike vision-based models, Next 14B prioritizes cognitive performance, demonstrating mastery in complex problem-solving and abstract logic across Turkish and English.
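
A minimal way to try the model locally is through the Hugging Face transformers library. The snippet below is a sketch that assumes the repository exposes standard causal-LM weights and a chat template; the exact loading arguments (dtype, quantization settings for the FP8 release) may differ.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Lamapi/next-14b"

# Load tokenizer and model; device_map="auto" places weights on available GPUs.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# A reasoning-style prompt, formatted with the model's chat template (assumed to be defined).
messages = [
    {"role": "user", "content": "Explain step by step why the sum of two odd numbers is always even."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```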

Key Capabilities

  • Advanced Reasoning: Excels in abstract logic, critical thinking, and long-form analysis.
  • Multilingual Understanding: Offers deep understanding of Turkish and fluent English, with support for over 30 languages (see the Turkish example after this list).
  • Mathematical & Analytical Skill: Performs exceptionally in structured problem-solving and scientific reasoning.
  • Enterprise Reliability: Provides consistent, interpretable outputs suitable for professional use cases.

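The Turkish-language capability above can be exercised with a prompt like the following. This is a sketch assuming a recent transformers release whose text-generation pipeline accepts chat-style message lists; the prompt itself is only an illustrative example.

```python
from transformers import pipeline

# Chat-style text-generation pipeline (requires a recent transformers release).
chat = pipeline("text-generation", model="Lamapi/next-14b", torch_dtype="auto", device_map="auto")

messages = [
    # "You are an analytical assistant that thinks step by step."
    {"role": "system", "content": "Sen adım adım düşünen analitik bir asistansın."},
    # "If a company's revenue grows 12% a year, how many years until it doubles? Explain step by step."
    {"role": "user", "content": "Bir şirketin geliri yılda %12 artıyorsa, gelirin iki katına çıkması kaç yıl sürer? Adım adım açıkla."},
]

result = chat(messages, max_new_tokens=400)
print(result[0]["generated_text"][-1]["content"])
```
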
Benchmark Performance

Next 14B demonstrates strong performance on reasoning benchmarks (a spot-check sketch follows the list):

  • MMLU (5-shot): 94.6%
  • MMLU-Pro: 93.2%
  • GSM8K: 98.8%
  • MATH: 92.7%

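The evaluation setup behind these figures is not documented here. One rough way to spot-check the MMLU (5-shot) number is with EleutherAI's lm-evaluation-harness; the harness choice and settings below are assumptions, so results may not match the reported scores exactly.

```python
# Sketch only: the harness, prompt format, and settings actually used for the
# reported scores are not stated, so this is an approximation, not a recipe.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=Lamapi/next-14b",
    tasks=["mmlu"],
    num_fewshot=5,
    batch_size=8,
)
print(results["results"].get("mmlu"))
```
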
Ideal Use Cases

  • Analytical Chatbots: For business and enterprise logic.
  • Research Assistance: Supporting scientific, legal, or data-heavy reasoning.
  • Education & Tutoring: Explaining concepts step-by-step.
  • Decision Support Systems: Evaluating scenarios and making inferences.