my-ai-stack/Stack-4.0-Qwen-3B-Merged

Text Generation · Concurrency Cost: 1 · Model Size: 3.1B · Quant: BF16 · Ctx Length: 32k · Published: Apr 26, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

my-ai-stack/Stack-4.0-Qwen-3B-Merged is a 3.1-billion-parameter instruction-tuned coding model, fully merged from Qwen2.5-Coder-3B-Instruct. Developed by my-ai-stack, it was fine-tuned on 55,000 agentic tool-use conversations, specializing it for code generation and agentic tasks. It runs as a standalone model with no adapter dependencies and supports a 32,768-token context length.


Stack 4.0 Omni-Nexus — Merged

This model, my-ai-stack/Stack-4.0-Qwen-3B-Merged, is a 3.1-billion-parameter instruction-tuned coding model. It is fully merged from Qwen2.5-Coder-3B-Instruct and was fine-tuned on 55,000 agentic tool-use conversations, enabling it to generate tool calls for agentic workflows.

Key Capabilities

  • Coding Focus: Specifically designed and fine-tuned for code generation tasks.
  • Agentic Tool-Use: Incorporates extensive training on agentic tool-use conversations.
  • Standalone Deployment: Provided as a fully merged model, eliminating the need for LoRA adapters or base model dependencies, allowing direct deployment.
  • Performance: Achieves 74.0% on HellaSwag (acc_norm) and 52.0% on ARC-Challenge (acc_norm) in 50-sample evaluations; internal coding samples produce valid Python.
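Because the model is merged from Qwen2.5-Coder-3B-Instruct, prompts presumably follow the Qwen ChatML chat format. Below is a minimal sketch of building such a prompt by hand; the `<|im_start|>`/`<|im_end|>` tags follow the Qwen2.5 convention and should be verified against the tokenizer's bundled chat template (in practice, prefer `tokenizer.apply_chat_template`):

```python
def build_chatml_prompt(messages):
    """Render a list of {role, content} messages in ChatML format,
    as used by Qwen2.5-family models (assumed here; confirm against
    the tokenizer's chat template)."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Leave the assistant turn open so the model completes it.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a coding assistant."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
])
print(prompt)
```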

Training Details

The model was trained with QLoRA and then merged into the base weights, with 7.3 million trainable parameters (0.24% of 3.1B). Training ran for 1,000 steps over approximately 10 hours on a GCP Tesla V100 16GB GPU, reaching a final training loss of 0.1411.
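The trainable-parameter fraction quoted above can be sanity-checked directly:

```python
trainable = 7.3e6   # trainable QLoRA parameters
total = 3.1e9       # total model parameters
fraction = trainable / total * 100
print(f"{fraction:.2f}%")  # → 0.24%
```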

Limitations

  • Size: As a 3B model, it may be less capable for complex, multi-step reasoning compared to larger models.
  • Language Optimization: Primarily optimized for English; performance in other languages may vary.
  • Tool Execution: While it generates tool calls, actual execution requires an external agent loop in the application.
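The external agent loop mentioned above can be quite small: parse the tool call the model emits, execute the matching function, and feed the result back as a new message. A minimal sketch, assuming the model emits a JSON object with `name` and `arguments` keys (the tool registry here is hypothetical, and the exact call format depends on the training data and should be confirmed against actual model output):

```python
import json

# Hypothetical tool registry for illustration.
TOOLS = {
    "get_weather": lambda city: f"Sunny in {city}",
}

def run_tool_call(model_output: str) -> str:
    """Parse a JSON tool call emitted by the model and execute it."""
    call = json.loads(model_output)
    fn = TOOLS.get(call["name"])
    if fn is None:
        return f"Unknown tool: {call['name']}"
    return fn(**call["arguments"])

# Example: a tool call the model might emit.
result = run_tool_call('{"name": "get_weather", "arguments": {"city": "Paris"}}')
print(result)  # → Sunny in Paris
```

The returned string would then be appended to the conversation (e.g. as a tool-role message) before the next generation step.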