farbodtavakkoli/OTel-LLM-4B-IT

Model Size: 4.3B · Quantization: BF16 · Context Length: 32k · Published: Feb 11, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

OTel-LLM-4B-IT by farbodtavakkoli is a 4.3 billion parameter, instruction-tuned language model based on Google's gemma-3-4b-it, specialized for the telecommunications domain. Fine-tuned on extensive telecom-specific data, it excels at RAG applications and question answering over telecom specifications and standards. This model is part of the OTel Family of Models, an open-source initiative for the global telecommunications sector.


OTel-LLM-4B-IT: A Specialized Telecom Language Model

OTel-LLM-4B-IT is a 4.3 billion parameter instruction-tuned language model developed by farbodtavakkoli, built upon the google/gemma-3-4b-it base model. It is a key component of the OTel Family of Models, an open-source effort dedicated to creating industry-standard AI solutions for the global telecommunications sector.

Key Capabilities and Training

This model has undergone full-parameter fine-tuning on a high-quality, curated dataset focused exclusively on telecommunications. The training data was compiled with input from over 200 domain experts from leading organizations and universities, ensuring deep specialization. Data sources include:

  • GSMA Permanent Reference Documents
  • 3GPP Specifications
  • O-RAN Documentation and RFC Series
  • Industry whitepapers and academic papers covering eSIM, terminals, security, networks, roaming, and APIs.

Intended Use Cases

OTel-LLM-4B-IT is specifically optimized for applications within the telecommunications industry:

  • RAG applications: Enhancing retrieval-augmented generation systems with telecom-specific knowledge.
  • Question Answering: Providing accurate answers to queries based on complex telecom specifications and standards.
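To make the RAG use case concrete, here is a minimal sketch of how retrieved telecom context could be assembled into a prompt for this model. The corpus snippets and the keyword-overlap retriever are hypothetical stand-ins for a real vector store, and the `<start_of_turn>` chat markers assume the model inherits the Gemma prompt format from its gemma-3-4b-it base.

```python
# Toy corpus of telecom snippets (hypothetical examples, not real retrieval data).
CORPUS = [
    "3GPP TS 23.501 defines the 5G System architecture.",
    "GSMA SGP.22 specifies remote SIM provisioning for consumer eSIM.",
    "O-RAN splits the RAN into O-CU, O-DU, and O-RU components.",
]

def retrieve(question: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank snippets by naive keyword overlap with the question.

    A real deployment would use an embedding index instead; this is
    only a self-contained illustration of the retrieval step.
    """
    q_words = set(question.lower().split())
    scored = sorted(
        corpus,
        key=lambda s: len(q_words & set(s.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question: str, contexts: list[str]) -> str:
    """Assemble a Gemma-style single-turn prompt with retrieved context."""
    context_block = "\n".join(f"- {c}" for c in contexts)
    return (
        "<start_of_turn>user\n"
        f"Use the following telecom references to answer.\n{context_block}\n\n"
        f"Question: {question}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

question = "What does O-RAN split the RAN into?"
prompt = build_prompt(question, retrieve(question, CORPUS))
print(prompt)
# The assembled prompt can then be passed to the model, e.g. via
# transformers.pipeline("text-generation", model="farbodtavakkoli/OTel-LLM-4B-IT")
```

In practice the `retrieve` step would be replaced by a vector database over chunked 3GPP/GSMA/O-RAN documents; only the prompt-assembly pattern carries over.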

This model offers a specialized solution for developers and researchers working with telecommunications data, leveraging its domain-specific training to outperform general-purpose LLMs on telecom tasks.