farbodtavakkoli/OTel-LLM-0.6B-IT
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 0.8B · Quant: BF16 · Ctx Length: 32k · Published: Feb 11, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

OTel-LLM-0.6B-IT by farbodtavakkoli is a 0.6-billion-parameter language model, fine-tuned from Qwen3-0.6B for the telecommunications domain. It targets question answering and retrieval-augmented generation (RAG) over telecom specifications and standards. The model was trained on high-quality, expert-curated data drawn from GSMA, 3GPP, and O-RAN documentation, which specializes it for industry-specific tasks.


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model cover the following sampler settings: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
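These sampler settings map directly onto a chat-completion request. The sketch below builds such a payload; the numeric values are illustrative placeholders (the actual community presets are shown on the page, not reproduced here), and `build_request` is a hypothetical helper, not a Featherless client.

```python
def build_request(prompt: str, **samplers) -> dict:
    """Assemble a chat-completion payload carrying arbitrary sampler settings."""
    payload = {
        "model": "farbodtavakkoli/OTel-LLM-0.6B-IT",
        "messages": [{"role": "user", "content": prompt}],
    }
    payload.update(samplers)  # e.g. temperature, top_p, min_p, ...
    return payload


req = build_request(
    "Summarize the O-RAN fronthaul split options.",
    # Placeholder values, not the presets used on Featherless:
    temperature=0.7,
    top_p=0.9,
    top_k=40,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    repetition_penalty=1.05,
    min_p=0.05,
)
```

Such a payload can then be POSTed to any OpenAI-compatible completion endpoint that accepts these extended sampler fields.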