farbodtavakkoli/OTel-LLM-27B-IT

Model Size: 27B · Quant: FP8 · Ctx Length: 32k · Published: Feb 11, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Vision

OTel-LLM-27B-IT by farbodtavakkoli is a 27 billion parameter language model fine-tuned from google/gemma-3-27b-it and optimized for the telecommunications domain. Trained on extensive telecom-focused data curated by industry experts, it excels at Retrieval Augmented Generation (RAG) applications and question answering over telecom specifications and standards. The model is part of the OTel Family, an open-source initiative for AI in the global telecommunications sector, and supports a 32,768-token context length.


Overview

OTel-LLM-27B-IT is a 27 billion parameter instruction-tuned language model developed by farbodtavakkoli, built upon the google/gemma-3-27b-it base model. It is a specialized model within the OTel Family of Models, an open-source effort dedicated to creating AI solutions for the global telecommunications industry. The model underwent full parameter fine-tuning using a comprehensive dataset focused on telecommunications.

Key Capabilities

  • Telecom Domain Expertise: Fine-tuned on high-quality data from sources like GSMA Permanent Reference Documents, 3GPP Specifications, O-RAN Documentation, RFC Series, and various industry whitepapers and academic papers.
  • RAG Optimization: Specifically designed to enhance Retrieval Augmented Generation (RAG) applications within the telecommunications sector.
  • Question Answering: Proficient in answering questions pertaining to complex telecom specifications and industry standards.
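As a sketch, an instruction-tuned model like this one can be queried through the Hugging Face `transformers` chat pipeline using the model id from this card. The question, generation settings, and helper function below are illustrative assumptions, not values taken from the card:

```python
# Hypothetical usage sketch for farbodtavakkoli/OTel-LLM-27B-IT.
# The model id comes from this card; everything else is illustrative.

def build_messages(question: str) -> list[dict]:
    """Wrap a telecom question in the chat-message format expected by
    Gemma-style instruction-tuned models (a list of role/content dicts)."""
    return [{"role": "user", "content": question}]

def ask(question: str) -> str:
    """Illustrative only: loading a 27B model downloads tens of GB of
    weights and needs substantial GPU memory."""
    from transformers import pipeline  # imported lazily; heavy dependency

    generator = pipeline(
        "text-generation",
        model="farbodtavakkoli/OTel-LLM-27B-IT",
        device_map="auto",
    )
    out = generator(build_messages(question), max_new_tokens=256)
    # Chat pipelines return the full message list; the last entry is the reply.
    return out[0]["generated_text"][-1]["content"]

if __name__ == "__main__":
    print(build_messages("What does 3GPP TS 23.501 specify?"))
```

Because the model is fine-tuned from google/gemma-3-27b-it, the base model's chat template applies; the pipeline handles that formatting automatically.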

Intended Use Cases

  • RAG applications in telecommunications.
  • Question answering on telecom specifications and standards.
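To make the RAG use case concrete, here is a minimal, self-contained sketch of the retrieval-then-prompt pattern the model is optimized for. The toy corpus, keyword-overlap scoring, and prompt wording are all stand-ins for a real vector store, embedder, and prompt template:

```python
# Minimal RAG sketch (illustrative assumptions throughout): retrieve the
# most relevant telecom snippets, then inject them into a grounded prompt
# that would be sent to OTel-LLM-27B-IT.

CORPUS = [
    "3GPP TS 23.501 defines the system architecture for the 5G System.",
    "O-RAN specifications disaggregate the RAN into O-CU, O-DU and O-RU.",
    "GSMA PRDs cover roaming, interconnect and security guidelines.",
]

def score(query: str, doc: str) -> int:
    """Toy relevance score: count of shared lowercase terms."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the top-k corpus snippets by keyword overlap."""
    return sorted(CORPUS, key=lambda d: score(query, d), reverse=True)[:k]

def build_rag_prompt(query: str) -> str:
    """Assemble a grounded prompt from the retrieved context."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query))
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )
```

In production, `retrieve` would typically be an embedding-based search over chunked 3GPP, O-RAN, or GSMA documents, with the assembled prompt passed to the model as a chat message.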

Training Details

The model was trained using the ScalarLM framework on a combination of TensorWave with AMD GPUs and Azure with NVIDIA GPUs. The training data was curated by over 200 domain experts from leading organizations including AT&T, GSMA, Purdue University, and Yale University, ensuring high relevance and accuracy for the telecom domain.