duttaturja/MLPredic-4B
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm

duttaturja/MLPredic-4B is a 4-billion-parameter Qwen3 causal language model developed by duttaturja and fine-tuned from unsloth/qwen3-4b-unsloth-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, yielding roughly 2x faster training. The model supports a 40,960-token context length, making it suitable for applications that require efficient processing of long sequences.
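Since the checkpoint is listed as BF16 open weights, the usual `transformers` loading path should apply. The snippet below is a minimal sketch, assuming the weights are published on the Hugging Face Hub under `duttaturja/MLPredic-4B` and that the tokenizer ships a chat template; the prompt text is purely illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "duttaturja/MLPredic-4B"

# Load the tokenizer and the BF16 weights, letting accelerate place them on available devices.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Build a chat-formatted prompt (assumes the repo provides a chat template).
messages = [{"role": "user", "content": "Summarize the benefits of long-context language models."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and decode only the newly produced tokens.
output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```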
