RJTPP/scot0500s-deepseek-1.5b-full

Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 21, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

RJTPP/scot0500s-deepseek-1.5b-full is a 1.5-billion-parameter, Qwen2-based causal language model developed by RJTPP. It was fine-tuned with Unsloth and Hugging Face's TRL library for faster training, and is designed for general language tasks.


Model Overview

RJTPP/scot0500s-deepseek-1.5b-full is a 1.5-billion-parameter language model based on the Qwen2 architecture. It was developed by RJTPP and fine-tuned from unsloth/DeepSeek-R1-Distill-Qwen-1.5B-unsloth-bnb-4bit, Unsloth's 4-bit (bitsandbytes) distribution of DeepSeek-R1-Distill-Qwen-1.5B.
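
Because the model follows the standard Qwen2 causal-LM layout, it should load through the usual transformers API. Below is a minimal loading-and-generation sketch: the repo id comes from this card, while the dtype, device placement, and prompt are illustrative choices rather than settings from the card.

```python
# Minimal loading-and-generation sketch using the standard transformers API.
# The repo id is from this card; dtype/device choices are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RJTPP/scot0500s-deepseek-1.5b-full"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 quant listed above
    device_map="auto",
)

prompt = "Explain the difference between a list and a tuple in Python."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```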

Key Characteristics

  • Efficient Fine-tuning: This model was fine-tuned using Unsloth and Hugging Face's TRL library, a combination reported to train roughly 2x faster than standard fine-tuning setups (a sketch of this workflow follows the list).
  • Base Model: Built upon DeepSeek-R1-Distill-Qwen-1.5B, a compact model distilled from DeepSeek-R1's reasoning outputs onto a Qwen-family base, giving it a strong foundation within its parameter class.
  • License: Distributed under the Apache-2.0 license, which permits commercial use, modification, and redistribution.
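
For context, the Unsloth + TRL combination described above typically looks like the following. This is a hedged sketch, not the author's actual training script: the dataset, LoRA settings, and hyperparameters are illustrative placeholders, and argument names vary across TRL versions (this follows the style of Unsloth's older examples).

```python
# Sketch of an Unsloth + TRL supervised fine-tuning setup, under the
# assumptions stated above. Dataset and hyperparameters are placeholders.
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

# Load the 4-bit base model named in this card.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/DeepSeek-R1-Distill-Qwen-1.5B-unsloth-bnb-4bit",
    max_seq_length=4096,
    load_in_4bit=True,
)

# Attach LoRA adapters (common defaults, not values from the card).
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Placeholder dataset: any text corpus with a "text" column works here.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",   # assumes a plain-text column named "text"
    max_seq_length=4096,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```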

Use Cases

This model is suitable for a variety of general language generation and understanding tasks where a compact, efficiently trained model is beneficial. Its fast fine-tuning workflow makes it a good candidate for applications requiring rapid iteration, and its 1.5B parameter count suits deployment in resource-constrained environments (see the sketch below).
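
For the resource-constrained case, one option is 4-bit loading via bitsandbytes. A sketch under the assumption that bitsandbytes is installed; the quantization settings are common defaults, not values from the card.

```python
# 4-bit loading sketch for resource-constrained deployment
# (requires the bitsandbytes package).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "RJTPP/scot0500s-deepseek-1.5b-full"

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)
```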