Pyefuri/Qwen2.5-3B-Bahasa-Biak-Final
Text generation · Model size: 3.1B · Quant: BF16 · Context length: 32k · Published: Mar 29, 2026 · License: cc-by-4.0 · Architecture: Transformer · Open weights

Pyefuri/Qwen2.5-3B-Bahasa-Biak-Final is a 3.1-billion-parameter causal language model fine-tuned by Pyefuri from unsloth/qwen2.5-3b-instruct-bnb-4bit. It is optimized for tasks in the Biak language (Bahasa Biak) and supports a 32,768-token context length. The model was fine-tuned efficiently using Unsloth together with Hugging Face's TRL library.
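A minimal usage sketch with the Hugging Face `transformers` library is shown below. It assumes the checkpoint is available on the Hub under the repo id above; the ChatML prompt format is the standard template for Qwen2.5 instruct models, and the helper names (`build_chatml_prompt`, `generate`) are illustrative, not part of the model card.

```python
MODEL_ID = "Pyefuri/Qwen2.5-3B-Bahasa-Biak-Final"  # assumed Hub repo id


def build_chatml_prompt(user_message: str) -> str:
    """Format a single-turn prompt in ChatML, the template Qwen2.5 uses."""
    return (
        "<|im_start|>user\n" + user_message + "<|im_end|>\n"
        "<|im_start|>assistant\n"
    )


def generate(user_message: str, max_new_tokens: int = 128) -> str:
    """Load the model (downloads ~3B weights on first call) and generate."""
    # Lazy import so the prompt helper above works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    inputs = tokenizer(build_chatml_prompt(user_message), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

On hardware without a GPU, loading in BF16 may be slow; quantized loading (e.g. via bitsandbytes) is a common alternative for a 3B model.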
