farbodtavakkoli/OTel-LLM-8.2B-IT
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Feb 11, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Warm
OTel-LLM-8.2B-IT is an 8.2-billion-parameter instruction-tuned language model by farbodtavakkoli, built on the Qwen3-8B architecture. It is fine-tuned on high-quality telecommunications-domain data curated by over 200 domain experts, and targets RAG and question-answering workloads in the telecom sector, such as processing and interpreting telecommunications specifications and standards.
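As a minimal sketch of the RAG use case described above, the snippet below assembles retrieved telecom-spec passages and a user question into a single prompt. The prompt layout, the helper name `build_rag_prompt`, and the example passages are illustrative assumptions, not part of the model card.

```python
# Hypothetical RAG prompt builder for OTel-LLM-8.2B-IT (sketch only).
def build_rag_prompt(question: str, retrieved_chunks: list[str]) -> str:
    """Concatenate retrieved telecom-spec passages with the user question."""
    # Number each retrieved passage so the model can cite it.
    context = "\n\n".join(
        f"[{i + 1}] {chunk}" for i, chunk in enumerate(retrieved_chunks)
    )
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

prompt = build_rag_prompt(
    "What does the NG-RAN handover procedure require?",
    ["3GPP TS 38.300 describes the NG-RAN overall architecture ..."],
)
print(prompt)
```

The prompt would then be sent to the model as a normal chat message; the retrieval step itself (vector store, embedder) is out of scope here.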
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model. Tracked parameters: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p (values not captured in this snapshot).
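To show where the sampler parameters listed above would go, here is a sketch of a chat-completion request payload. The endpoint URL, exact parameter support, and all numeric values are assumptions based on a typical OpenAI-compatible API, not the (unshown) Featherless user configurations.

```python
# Hypothetical request payload; sampler values are placeholders.
payload = {
    "model": "farbodtavakkoli/OTel-LLM-8.2B-IT",
    "messages": [
        {"role": "user", "content": "Summarise the purpose of 3GPP TS 23.501."}
    ],
    # Sampler settings corresponding to the fields listed above.
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
    "max_tokens": 512,
}

# The payload would be POSTed to an OpenAI-compatible endpoint, e.g.:
# requests.post("https://api.featherless.ai/v1/chat/completions",
#               json=payload,
#               headers={"Authorization": f"Bearer {API_KEY}"})
```

Note that `repetition_penalty` and `min_p` are extensions beyond the core OpenAI schema; support varies by serving stack.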