yoriis/Fanar_9B-Base_IT_0.3
Text Generation · Concurrency Cost: 1 · Model Size: 9B · Quant: FP8 · Ctx Length: 16k · Published: Dec 31, 2025 · Architecture: Transformer · Cold

Fanar_9B-Base_IT_0.3 by yoriis is a 9-billion-parameter instruction-tuned language model with a 16,384-token context length. As a base iteration, it serves as a foundation for further specialization. Its defining characteristic is instruction following, which makes it suitable for general-purpose conversational AI and for executing tasks described in explicit prompts.
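The 16,384-token context length means long prompts need to be budgeted before being sent to the model. A minimal sketch of such a check is below; the 4-characters-per-token ratio and the `fits_context` helper are illustrative assumptions, not part of the model's actual tooling (for accurate counts, use the model's real tokenizer):

```python
# Hedged sketch: estimate whether a prompt plus a generation budget
# fits within the model's 16k context window. The chars-per-token
# ratio is a rough heuristic for English text, NOT the model's
# actual tokenizer.

CTX_LENGTH = 16_384          # Fanar_9B-Base_IT_0.3 context length
CHARS_PER_TOKEN = 4          # crude heuristic; replace with a real tokenizer

def fits_context(prompt: str, max_new_tokens: int = 512) -> bool:
    """Return True if the estimated prompt tokens plus the
    generation budget fit within the context window."""
    est_prompt_tokens = len(prompt) // CHARS_PER_TOKEN + 1
    return est_prompt_tokens + max_new_tokens <= CTX_LENGTH

print(fits_context("Summarize the following report: ..."))  # True (short prompt)
print(fits_context("x" * 100_000))                          # False (exceeds 16k)
```

In practice this kind of pre-flight check avoids silent truncation of long inputs when the prompt plus the requested completion would overflow the window.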
