QCRI/Fanar-2-27B-Instruct
Text Generation · Concurrency Cost: 2 · Model Size: 27B · Quant: FP8 · Ctx Length: 32k · Published: Mar 16, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

QCRI/Fanar-2-27B-Instruct is a 27 billion parameter Arabic-English instruction-tuned causal language model developed by Qatar Computing Research Institute (QCRI) at HBKU, with a context length of 32,768 tokens. Continually pretrained on 166B Arabic and English tokens, it features native Arabic reasoning traces, selective thinking mode, tool calling, and advanced hallucination mitigation. This model excels in Arabic language understanding and cultural alignment, making it highly suitable for applications requiring robust bilingual capabilities and adherence to Islamic values and Arabic culture.
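Since the model supports tool calling, a request typically supplies a tool schema alongside the chat messages. The sketch below assumes the common OpenAI-style message/tool convention used by many instruction-tuned models; the `get_weather` tool and the `render_prompt` helper are hypothetical illustrations, and a real deployment would instead pass the messages and tools to the model's own chat template (e.g. via its tokenizer's `apply_chat_template`).

```python
# Hypothetical sketch of a bilingual tool-calling request for an
# instruction-tuned model such as QCRI/Fanar-2-27B-Instruct.
# The schema follows the widespread OpenAI-style convention; the
# model's actual chat template may differ.

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool, for illustration only
        "description": "Return the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

messages = [
    {"role": "system",
     "content": "You are a helpful bilingual Arabic-English assistant."},
    {"role": "user",
     "content": "ما حالة الطقس في الدوحة؟"},  # "What is the weather in Doha?"
]

def render_prompt(messages: list[dict]) -> str:
    """Flatten messages into a plain-text prompt for inspection.

    A real deployment would instead call the model tokenizer's
    apply_chat_template(messages, tools=tools) to get the exact format
    the model was trained on.
    """
    return "\n".join(f"[{m['role']}] {m['content']}" for m in messages)

prompt = render_prompt(messages)
```

The system message fixes the assistant's bilingual persona, while the tool schema lets the model emit a structured `get_weather` call instead of guessing at live data, which is one way hallucination mitigation interacts with tool use.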
