dominicjyh/bazi
Text Generation
- Concurrency Cost: 1
- Model Size: 7.6B
- Quant: FP8
- Context Length: 32k
- Published: Apr 10, 2026
- License: apache-2.0
- Architecture: Transformer
- Tags: Open Weights, Cold
The dominicjyh/bazi model is a 7.6 billion parameter Qwen2-based causal language model, fine-tuned by dominicjyh. It was trained with Unsloth and Hugging Face's TRL library, a combination advertised as roughly 2x faster fine-tuning. The model is intended for general text-generation tasks.
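Since the card describes a standard Qwen2-based causal language model, a minimal sketch of loading it for generation with the Hugging Face transformers library might look like the following. The repo ID is taken from the card; the assumption that the weights are hosted on the Hugging Face Hub, along with the prompt and sampling settings, is illustrative.

```python
# Minimal sketch: load dominicjyh/bazi for text generation with transformers.
# Assumes the weights are available on the Hugging Face Hub under this repo ID
# and that the `accelerate` package is installed for device_map="auto".
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "dominicjyh/bazi"  # repo ID from the model card above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # place layers on available GPU/CPU automatically
    torch_dtype="auto",  # use the dtype stored in the checkpoint
)

prompt = "Write a short introduction to this model."  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate up to 256 new tokens; sampling parameters here are examples only.
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the card lists an FP8 quantization; depending on how the checkpoint is stored, loading it may require additional quantization support beyond the plain `from_pretrained` call shown here.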