hhuihiu/ADAM-STUDIO-MAX
TEXT GENERATION

  • Concurrency Cost: 1
  • Model Size: 1.5B
  • Quantization: BF16
  • Context Length: 32k
  • Published: Apr 10, 2026
  • License: apache-2.0
  • Architecture: Transformer
  • Open Weights

The hhuihiu/ADAM-STUDIO-MAX model is an instruction-tuned 1.5-billion-parameter causal language model from the Qwen2.5-Coder series, developed by Qwen. It features a 32,768-token context length and is designed specifically for code generation, code reasoning, and code fixing. Building on the Qwen2.5 architecture, it delivers significant improvements in coding capability while retaining strengths in mathematics and general competencies, making it suitable for real-world applications such as code agents.


Qwen2.5-Coder-1.5B-Instruct Overview

hhuihiu/ADAM-STUDIO-MAX is an instruction-tuned variant of the Qwen2.5-Coder series, a family of code-specific large language models developed by Qwen. This particular model features 1.5 billion parameters and a substantial context length of 32,768 tokens, making it well-suited for handling extensive codebases and complex programming tasks.

Key Capabilities

  • Enhanced Code Performance: Significant improvements in code generation, code reasoning, and code fixing, building on the strong Qwen2.5 foundation.
  • Broad Training Data: Trained on 5.5 trillion tokens, including source code, text-code grounding, and synthetic data, to ensure robust coding abilities.
  • Architectural Features: Utilizes a transformer architecture with RoPE, SwiGLU, RMSNorm, Attention QKV bias, and tied word embeddings.
  • General Competencies: Maintains strong performance in mathematics and general language understanding, alongside its specialized coding skills.
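As an instruction-tuned Qwen2.5 model, it expects prompts in the ChatML format. The sketch below shows roughly how a conversation is assembled into that format; in practice you would let `tokenizer.apply_chat_template` do this, and the exact system defaults may differ from this hand-rolled version.

```python
def build_chatml_prompt(messages):
    """Assemble a ChatML-style prompt from a list of {role, content} dicts.

    Mirrors the turn structure Qwen2.5 instruct models are trained on;
    the tokenizer's own chat template is the authoritative source.
    """
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    # Open the assistant turn so the model generates the reply.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
]
prompt = build_chatml_prompt(messages)
print(prompt)
```

Feeding a prompt shaped like this (via the chat template) is what lets the model distinguish instructions from the code it should generate.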

Good For

  • Code Agents: Provides a comprehensive foundation for developing advanced code agents due to its enhanced coding and reasoning capabilities.
  • Programming Tasks: Ideal for developers requiring assistance with generating, debugging, or understanding code across various programming languages.
  • Long Context Applications: The 32,768 token context window supports processing and generating code for larger projects or complex problem statements.
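When targeting the 32,768-token window, it helps to sanity-check that a prompt plus the expected reply will fit before sending it. The sketch below uses a rough chars-per-token heuristic (the value 4.0 is an assumption; the real count comes from the model's tokenizer):

```python
def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    # Rough heuristic: roughly 4 characters per token for English/code mixes.
    return max(1, round(len(text) / chars_per_token))

def fits_context(prompt: str, max_new_tokens: int, ctx_len: int = 32768) -> bool:
    # Leave room for the model's reply within the 32,768-token window.
    return estimate_tokens(prompt) + max_new_tokens <= ctx_len

# ~32 KB of repeated code, i.e. roughly 8k estimated tokens.
source = "def add(a, b):\n    return a + b\n" * 1000
print(estimate_tokens(source), fits_context(source, max_new_tokens=1024))
```

For precise budgeting, tokenize with the model's own tokenizer instead of the heuristic; the sketch only catches gross overruns early.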