dominicjyh/bazi

Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32k · Published: Apr 10, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

The dominicjyh/bazi model is a 7.6-billion-parameter Qwen2-based causal language model fine-tuned by dominicjyh. It was trained with Unsloth and Hugging Face's TRL library, a combination reported to make fine-tuning about 2x faster. The model targets general language generation tasks.


Model Overview

The dominicjyh/bazi model is a 7.6-billion-parameter language model based on the Qwen2 architecture. It was developed by dominicjyh and fine-tuned from the unsloth/deepseek-r1-distill-qwen-7b-unsloth-bnb-4bit base model, Unsloth's 4-bit (bitsandbytes) quantization of DeepSeek-R1-Distill-Qwen-7B.

Key Characteristics

  • Architecture: Qwen2-based causal language model.
  • Parameter Count: 7.6 billion parameters.
  • Context Length: 32,768 tokens.
  • Training Efficiency: Fine-tuned with Unsloth and Hugging Face's TRL library, which Unsloth reports as roughly 2x faster than standard fine-tuning methods (a sketch of this recipe follows the list).
  • License: Released under the Apache-2.0 license.
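The card does not publish the training script, so the following is only a minimal sketch of a typical Unsloth + TRL supervised fine-tuning run against the named base model. The dataset file, LoRA settings, sequence length, and trainer hyperparameters are illustrative assumptions, and SFTTrainer's keyword arguments vary somewhat across TRL versions.

```python
# Minimal Unsloth + TRL SFT sketch; hyperparameters are illustrative, not the
# author's actual training configuration.
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

# Load the 4-bit base model named on the card; Unsloth patches it for faster training.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/deepseek-r1-distill-qwen-7b-unsloth-bnb-4bit",
    max_seq_length=2048,  # assumed training length; the model itself supports 32k
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small fraction of the weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# Hypothetical dataset with a "text" column of training examples.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        max_steps=100,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```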

Potential Use Cases

This model is suited to natural language processing tasks where a Qwen2-based model with efficient fine-tuning is beneficial. Its 7.6 billion parameters and 32,768-token context length make it capable of handling diverse language generation and understanding applications; a loading sketch follows below.
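As a concrete starting point, here is a minimal inference sketch using Hugging Face transformers, assuming the dominicjyh/bazi repository hosts standard transformers-format weights (the FP8 quantization listed above may apply only to the hosted serving endpoint). The prompt, dtype, and sampling settings are illustrative.

```python
# Minimal generation sketch for dominicjyh/bazi with Hugging Face transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "dominicjyh/bazi"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumes a GPU with bf16 support
    device_map="auto",
)

prompt = "Explain what a causal language model is in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```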