WorldOpenTechnology/Araptor-1
Text Generation · Open Weights · Warm

Concurrency Cost: 1
Model Size: 4B
Quant: BF16
Ctx Length: 32k
Published: Feb 1, 2026
License: apache-2.0
Architecture: Transformer

WorldOpenTechnology/Araptor-1 is a 4-billion-parameter Qwen3-based instruction-tuned causal language model developed by WorldOpenTechnology. It was fine-tuned with Unsloth and Hugging Face's TRL library, with an emphasis on training efficiency. The model is designed for general instruction-following tasks, leveraging the Qwen3 architecture for robust performance.
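Since the model is a standard instruction-tuned causal LM published as open weights, it can be loaded locally with the Hugging Face transformers API. The sketch below is a minimal example, not an official usage snippet: the chat-template behavior is assumed from typical Qwen3-based instruct models, and the prompt is illustrative.

```python
# Minimal sketch: loading Araptor-1 with transformers (assumed standard usage;
# not an official example from the model authors).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "WorldOpenTechnology/Araptor-1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the published BF16 weights
    device_map="auto",
)

# Instruction-tuned Qwen3-family models ship a chat template that
# apply_chat_template picks up automatically (assumed here).
messages = [{"role": "user", "content": "Explain what a causal language model is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```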


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model span the following sampler parameters:

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
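For reference, these parameters map onto a standard OpenAI-compatible chat completion request. The sketch below is an assumption-laden illustration: the base URL is the commonly cited Featherless endpoint (not confirmed by this page), the numeric values are placeholders rather than the actual top-3 configs, and passing top_k, repetition_penalty, and min_p via extra_body assumes the server accepts those non-standard fields.

```python
# Hedged sketch: sending these sampler settings through an OpenAI-compatible
# client. All numeric values are illustrative placeholders, NOT the real
# user configs; the base_url is an assumption.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="WorldOpenTechnology/Araptor-1",
    messages=[{"role": "user", "content": "Write a haiku about open weights."}],
    # Standard OpenAI-style sampler parameters:
    temperature=0.7,          # placeholder
    top_p=0.9,                # placeholder
    frequency_penalty=0.0,    # placeholder
    presence_penalty=0.0,     # placeholder
    # Non-standard parameters, passed through if the server supports them:
    extra_body={
        "top_k": 40,              # placeholder
        "repetition_penalty": 1.1,  # placeholder
        "min_p": 0.05,            # placeholder
    },
)
print(response.choices[0].message.content)
```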