farbodtavakkoli/OTel-LLM-12B-IT

Hugging Face
Vision | Model Size: 12B | Quant: FP8 | Context Length: 32k | Concurrency Cost: 1 | Published: Feb 11, 2026 | License: apache-2.0 | Architecture: Transformer | Open Weights

OTel-LLM-12B-IT by farbodtavakkoli is a 12 billion parameter instruction-tuned language model based on Google's Gemma-3-12b-it, specifically fine-tuned for the telecommunications domain. It was trained on extensive telecom-focused data, including GSMA, 3GPP, and O-RAN specifications, curated by over 200 domain experts. This model is optimized for RAG applications and question answering within the telecommunications sector, providing specialized knowledge for industry-specific tasks.


OTel-LLM-12B-IT: A Specialized Telecom Language Model

OTel-LLM-12B-IT is a 12 billion parameter instruction-tuned language model developed by farbodtavakkoli, built upon Google's Gemma-3-12b-it base model. It is a key component of the OTel Family of Models, an initiative focused on creating open-source AI solutions for the global telecommunications industry.
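A minimal inference sketch is shown below, assuming the weights are hosted on the Hugging Face Hub under the repo id `farbodtavakkoli/OTel-LLM-12B-IT` and that the model uses the standard Gemma-3 chat template via `transformers`; the question text is illustrative.

```python
# Hedged inference sketch for OTel-LLM-12B-IT. Assumes the repo id below
# resolves on the Hugging Face Hub and that the tokenizer ships a chat
# template (as Gemma-3 instruction-tuned models do).
MODEL_ID = "farbodtavakkoli/OTel-LLM-12B-IT"


def build_messages(question: str) -> list[dict]:
    """Wrap a single telecom question in the chat-message format
    consumed by the tokenizer's chat template."""
    return [{"role": "user", "content": question}]


def ask(question: str, max_new_tokens: int = 256) -> str:
    """Load the model, apply the chat template, and generate an answer.
    transformers is imported lazily so the helper above stays usable
    without the (large) model dependencies installed."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer.apply_chat_template(
        build_messages(question), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(ask("Summarize what O-RAN fronthaul specifications cover."))
```

Because the model is 12B parameters, `device_map="auto"` delegates placement to `accelerate`; an FP8 or other quantized deployment would pass the appropriate quantization config to `from_pretrained` instead.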

Key Capabilities & Training

This model underwent full parameter fine-tuning using a high-quality, telecom-specific dataset. This data was meticulously curated by over 200 domain experts from various organizations, including AT&T, GSMA, Purdue University, and Yale University. The training data encompasses a wide range of telecommunications documentation, such as:

  • GSMA Permanent Reference Documents
  • 3GPP Specifications
  • O-RAN Documentation
  • RFC Series
  • Industry whitepapers and academic papers covering eSIM, security, networks, and more

Intended Use Cases

OTel-LLM-12B-IT is specifically optimized for applications requiring deep knowledge of telecommunications. It excels in:

  • RAG applications within the telecommunications domain
  • Question answering based on telecom specifications and industry standards

This model is designed to provide accurate and relevant information for specialized telecom tasks, leveraging its extensive domain-specific training.
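To make the RAG use case concrete, here is a small, self-contained sketch of the retrieval and prompt-assembly step that would sit in front of the model. The keyword-overlap scorer is a toy stand-in for a real vector store, and the corpus snippets are illustrative paraphrases, not actual spec excerpts.

```python
# Toy RAG front-end for telecom Q&A: retrieve the most relevant passages,
# then assemble a grounded prompt for the model. A production system would
# replace score()/retrieve() with embedding search over real documents.

def score(query: str, passage: str) -> int:
    """Count query words that appear in the passage (case-insensitive)."""
    words = set(query.lower().split())
    return sum(1 for w in passage.lower().split() if w in words)


def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k highest-scoring passages for the query."""
    return sorted(corpus, key=lambda p: score(query, p), reverse=True)[:k]


def build_prompt(query: str, corpus: list[str]) -> str:
    """Build a prompt that instructs the model to answer only from
    the retrieved references."""
    context = "\n".join(f"- {p}" for p in retrieve(query, corpus))
    return (
        "Answer using only the telecom references below.\n"
        f"References:\n{context}\n"
        f"Question: {query}\n"
    )


# Illustrative corpus (not real spec text).
corpus = [
    "eSIM profiles are provisioned remotely per GSMA remote SIM provisioning.",
    "O-RAN splits the RAN into open, interoperable components.",
    "The 5G core separates the user plane from the control plane.",
]

prompt = build_prompt("How are eSIM profiles provisioned?", corpus)
```

The resulting `prompt` string would then be wrapped in a user chat message and passed to the model, keeping its answers grounded in the retrieved specification text.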