Trong8223/hpt-trade-ai-v2

TEXT GENERATION

  • Concurrency Cost: 1
  • Model Size: 7.6B
  • Quant: FP8
  • Ctx Length: 32k
  • Published: Apr 15, 2026
  • License: apache-2.0
  • Architecture: Transformer
  • Open Weights

Trong8223/hpt-trade-ai-v2 is a 7.6-billion-parameter Qwen2.5-based instruction-tuned language model developed by Trong8223, fine-tuned from unsloth/Qwen2.5-7B-Instruct-bnb-4bit. The model was trained using Unsloth and Hugging Face's TRL library, which is reported to deliver 2x faster training. Its primary application is general instruction following, where an efficient Qwen2.5-class model is a good fit.

Model Overview

Trong8223/hpt-trade-ai-v2 is a 7.6-billion-parameter instruction-tuned language model developed by Trong8223. It is based on the Qwen2.5 architecture and was fine-tuned from the unsloth/Qwen2.5-7B-Instruct-bnb-4bit checkpoint. Training used the Unsloth library together with Hugging Face's TRL library, which enabled a 2x faster training process than standard fine-tuning.
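
The checkpoint can be loaded with the standard Hugging Face transformers API. The snippet below is a minimal sketch that assumes the repository ships merged weights compatible with AutoModelForCausalLM; it is not taken from the card itself.

```python
# Minimal sketch: load the checkpoint with the standard transformers API.
# Assumes the repo ships merged weights loadable by AutoModelForCausalLM;
# device_map="auto" additionally requires the accelerate package.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Trong8223/hpt-trade-ai-v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # keep the dtype stored in the checkpoint
    device_map="auto",   # spread layers across available GPU(s)
)
```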

Key Characteristics

  • Architecture: Qwen2.5-based, instruction-tuned.
  • Parameter Count: 7.6 billion.
  • Context Length: Supports a context length of 32,768 tokens.
  • Training Efficiency: Uses Unsloth for accelerated fine-tuning, reported as 2x faster than standard methods (see the sketch after this list).
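
For context, an Unsloth + TRL fine-tuning pipeline of the kind the card describes typically looks like the sketch below. The dataset, LoRA rank, and hyperparameters here are placeholder assumptions for illustration, not the author's actual recipe, and the exact SFTTrainer/SFTConfig argument names vary slightly across TRL versions.

```python
# Illustrative Unsloth + TRL SFT loop; the dataset, LoRA rank, and
# hyperparameters are placeholder assumptions, not the author's recipe.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Qwen2.5-7B-Instruct-bnb-4bit",  # 4-bit base named on the card
    max_seq_length=32768,
    load_in_4bit=True,
)
model = FastLanguageModel.get_peft_model(
    model,
    r=16,  # LoRA rank (assumption)
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

dataset = load_dataset("yahma/alpaca-cleaned", split="train")  # placeholder dataset

def to_text(example):
    # Collapse Alpaca-style fields into one training string per example.
    return {"text": f"{example['instruction']}\n{example['input']}\n{example['output']}"}

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset.map(to_text),
    args=SFTConfig(
        dataset_text_field="text",
        max_seq_length=2048,  # shorter than the 32k context to fit GPU memory
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        max_steps=60,
        output_dir="outputs",
    ),
)
trainer.train()
```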

Intended Use Cases

This model is suitable for applications that need a capable Qwen2.5-based instruction-following model, particularly where efficient training methodology is a priority. As an instruction fine-tune, it is best applied to general instruction-following tasks.
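
As a usage illustration, a single-turn instruction can be run through Qwen2.5's chat template as sketched below; this reuses the `model` and `tokenizer` from the loading snippet above, and the prompt is an arbitrary example, not one from the card.

```python
# Sketch of single-turn inference, reusing `model` and `tokenizer` from
# the loading snippet above; the prompt is an arbitrary example.
messages = [{"role": "user", "content": "Summarize the key risks of margin trading."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```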