farbodtavakkoli/OTel-LLM-20B-Reasoning
Text Generation · Model size: 20B · Quantization: FP8 · Context length: 32k · Published: Feb 15, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

OTel-LLM-20B-Reasoning is a 20-billion-parameter, English-language model developed by farbodtavakkoli and fine-tuned from openai/gpt-oss-20b. It is specialized for the telecommunications domain and optimized for retrieval-augmented generation (RAG) and question answering over telecom specifications and standards. It was trained on high-quality, curated telecom-focused data, including GSMA, 3GPP, and O-RAN documentation.


OTel-LLM-20B-Reasoning: Telecom-Specialized Language Model

OTel-LLM-20B-Reasoning is a 20-billion-parameter language model developed by farbodtavakkoli and fine-tuned specifically for the telecommunications sector. Built on the openai/gpt-oss-20b base model, it is part of the OTel Family of Models, an open-source initiative aimed at providing industry-standard AI for global telecommunications.

Key Capabilities & Training

  • Domain Expertise: Specialized in telecommunications, trained on extensive, high-quality data curated by over 200 domain experts from leading organizations like AT&T, GSMA, and Purdue University.
  • Comprehensive Data Sources: Training data includes GSMA Permanent Reference Documents, 3GPP Specifications, O-RAN Documentation, RFC Series, and industry whitepapers covering eSIM, networks, security, and more.
  • Full-Parameter Fine-tuning: Uses full-parameter fine-tuning, updating all model weights rather than adapter layers, for stronger domain-specific performance.

Intended Use Cases

  • RAG Applications: Optimized for retrieval-augmented generation (RAG) within the telecommunications domain.
  • Question Answering: Excels at answering questions related to telecom specifications, standards, and technical documentation.
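As an illustrative sketch of the RAG workflow this model targets, the snippet below assembles a grounded prompt from retrieved telecom passages. The toy keyword retriever, sample corpus, and prompt template are assumptions for demonstration only; a real deployment would use a vector store and the model's own chat template.

```python
# Sketch of RAG prompt assembly for a telecom QA model.
# The retriever, corpus, and prompt template here are illustrative
# assumptions, not part of the model card.

def retrieve(question: str, corpus: list[str], k: int = 2) -> list[str]:
    """Toy keyword-overlap retriever standing in for a real vector store."""
    q_terms = set(question.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: -len(q_terms & set(doc.lower().split())),
    )
    return scored[:k]

def build_rag_prompt(question: str, contexts: list[str]) -> str:
    """Concatenate retrieved passages and the question into one prompt."""
    context_block = "\n\n".join(
        f"[{i + 1}] {c}" for i, c in enumerate(contexts)
    )
    return (
        "Answer the question using only the passages below.\n\n"
        f"{context_block}\n\n"
        f"Question: {question}\nAnswer:"
    )

# Hypothetical mini-corpus of telecom spec summaries.
corpus = [
    "3GPP TS 23.501 defines the 5G System architecture.",
    "GSMA SGP.22 specifies remote SIM provisioning for consumer eSIM.",
    "O-RAN WG4 defines the open fronthaul interface.",
]

question = "Which specification covers consumer eSIM provisioning?"
contexts = retrieve(question, corpus)
prompt = build_rag_prompt(question, contexts)
print(prompt)
```

The resulting prompt string would then be passed to the model (for example via the Transformers text-generation pipeline), which answers using only the retrieved context.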

This model is ideal for developers and researchers building AI solutions that require deep understanding and generation capabilities within the complex telecommunications landscape.