jana-ashraf-ai/python-assistant

Hugging Face
Text generation · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Published: Mar 25, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

The jana-ashraf-ai/python-assistant is a 1.5-billion-parameter causal language model fine-tuned by jana-ashraf-ai from Qwen2.5-1.5B-Instruct. The model specializes in answering Python programming questions: it accepts English queries and returns structured JSON responses in Arabic, making it a fit for developers who want Python solutions with detailed explanations in Arabic.


Model Overview

The jana-ashraf-ai/python-assistant is a specialized language model developed by jana-ashraf-ai, fine-tuned from the Qwen2.5-1.5B-Instruct base model. Its primary function is to assist users with Python programming queries.

Key Capabilities

  • Python Programming Assistance: Designed to understand and respond to Python-related questions.
  • Multilingual Output: Accepts questions in English and generates detailed, step-by-step solutions in Arabic.
  • Structured JSON Output: Returns answers as structured JSON, making them easy to consume programmatically.
  • Specialized Fine-tuning: QLoRA fine-tuning on a curated dataset of 1,000 Python code instructions sharpens its ability to produce relevant Arabic explanations.
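Because the model emits structured JSON, callers can parse its replies directly. The card does not publish the exact response schema, so the `explanation` and `code` field names below are assumptions for illustration; a minimal parsing sketch:

```python
import json

# Hypothetical reply from the model; the actual field names are not
# documented on the card, so "explanation" and "code" are assumptions.
raw_response = """
{
  "explanation": "نستخدم تعبير قائمة لتصفية الأعداد الزوجية من القائمة.",
  "code": "evens = [n for n in numbers if n % 2 == 0]"
}
"""

def parse_assistant_reply(text: str) -> dict:
    """Parse the model's JSON reply, raising a clear error on malformed output."""
    try:
        return json.loads(text)
    except json.JSONDecodeError as exc:
        raise ValueError(f"model did not return valid JSON: {exc}") from exc

reply = parse_assistant_reply(raw_response)
print(reply["code"])         # the Python snippet, ready to display
print(reply["explanation"])  # the step-by-step Arabic explanation
```

Validating the JSON before use also gives a clean failure path when a small model occasionally produces malformed output.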

Training Details

The model was fine-tuned using QLoRA (LoRA rank=32) via LLaMA-Factory, leveraging a subset of the iamtarun/python_code_instructions_18k_alpaca dataset. The training involved 3 epochs with a learning rate of 1e-4, using 4-bit quantization (nf4) on a Google Colab T4 GPU.
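For readers who want to reproduce a similar run, the setup above roughly corresponds to an SFT config like the following. This is a sketch only: the key names follow LLaMA-Factory conventions, but the actual training file and dataset name used by the author are not published, so `dataset` and `template` values are assumptions.

```yaml
# Sketch of a LLaMA-Factory SFT config approximating the card's stated setup.
model_name_or_path: Qwen/Qwen2.5-1.5B-Instruct
stage: sft
do_train: true
finetuning_type: lora
lora_rank: 32
quantization_bit: 4        # QLoRA: 4-bit (nf4) quantized base weights
dataset: python_code_instructions_subset   # assumed name for the 1,000-example subset
template: qwen             # assumed chat template for the Qwen2.5 base
num_train_epochs: 3
learning_rate: 1.0e-4
```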

When to Use This Model

This model is ideal for developers or learners who:

  • Need clear, structured answers to Python programming questions.
  • Require explanations and solutions specifically in Arabic.
  • Are looking for a model optimized for Python code instruction rather than general-purpose tasks.

Limitations

  • Language Specificity: Answers are exclusively in Arabic.
  • Domain Specificity: Optimized solely for Python programming questions.
  • Model Size: As a 1.5B-parameter model, it may struggle with highly complex or nuanced programming problems.