par0v0z/oracul-1.7b
Text Generation · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Mar 6, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm

par0v0z/oracul-1.7b is a 1.7-billion-parameter (listed as 2B) Qwen3-based causal language model developed by par0v0z. It was fine-tuned with Unsloth and Hugging Face's TRL library, which enables faster training. The model targets general-purpose language tasks and relies on the Qwen3 architecture for efficient performance.
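A minimal usage sketch, assuming the weights are hosted on the Hugging Face Hub under the repo id shown above and that the tokenizer ships with a chat template (standard for Qwen3-based checkpoints); the prompt, token budget, and device placement are illustrative and should be adjusted for your hardware.

```python
# Load the model with Hugging Face Transformers and run a short chat-style generation.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "par0v0z/oracul-1.7b"  # repo id as listed on this page

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 precision listed above
    device_map="auto",
)

# Example prompt; the chat template comes from the tokenizer config.
messages = [{"role": "user", "content": "Summarize what a causal language model does."}]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

with torch.no_grad():
    output = model.generate(input_ids, max_new_tokens=128)

# Decode only the newly generated tokens.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```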
