farbodtavakkoli/OTel-LLM-270M-IT
Text generation · Model size: 0.27B · Quantization: BF16 · Context length: 32k · Published: Feb 11, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

farbodtavakkoli/OTel-LLM-270M-IT is a 270-million-parameter language model, fine-tuned from google/gemma-3-270m-it specifically for the telecommunications domain. Developed by Farbod Tavakkoli as part of the OTel Family of Models, it is optimized for telecom-specific RAG applications and question answering, and is designed to process and understand information from telecommunications specifications, standards, and industry documentation.


OTel-LLM-270M-IT: Telecom-Specialized Language Model

OTel-LLM-270M-IT is a 270-million-parameter language model, fine-tuned from the google/gemma-3-270m-it base model. It is part of the OTel Family of Models, an open-source initiative focused on developing AI models for the global telecommunications sector. The model underwent full-parameter fine-tuning on a high-quality, English-language dataset curated by over 200 domain experts from leading organizations including AT&T, GSMA, and Purdue University.
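Since the model is an instruction-tuned Gemma derivative, it can be loaded with the standard Hugging Face `transformers` chat workflow. The sketch below is illustrative, not from the card: the repo id comes from this page, but the generation settings and the example question are assumptions.

```python
# Minimal inference sketch for the model described on this card.
# Repo id is from the card; generation settings are illustrative.
MODEL_ID = "farbodtavakkoli/OTel-LLM-270M-IT"


def build_messages(question: str) -> list:
    """Wrap a telecom question in the chat-message format used by
    instruction-tuned Gemma models (a list of role/content dicts)."""
    return [{"role": "user", "content": question}]


def main() -> None:
    # Heavy imports are kept local so the helper above stays usable
    # without torch/transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    # Apply the model's chat template, then generate a short answer.
    inputs = tokenizer.apply_chat_template(
        build_messages("What does GSMA SGP.22 specify?"),
        add_generation_prompt=True,
        return_tensors="pt",
    )
    outputs = model.generate(inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, not the prompt.
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

At 270M parameters in BF16, the weights fit comfortably on CPU or a small GPU, which suits the on-premises deployments common in telecom environments.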

Key Capabilities

  • Domain-Specific Understanding: Specialized in telecommunications, trained on data including GSMA Permanent Reference Documents, 3GPP Specifications, O-RAN Documentation, and RFC Series.
  • RAG Applications: Optimized for Retrieval Augmented Generation (RAG) within the telecom industry.
  • Question Answering: Designed to accurately answer questions based on telecom specifications, standards, and industry whitepapers.

Intended Use Cases

  • Information Retrieval: Extracting specific details from vast telecommunications documentation.
  • Technical Support: Assisting with queries related to telecom standards, eSIM, networks, and APIs.
  • Research and Development: Aiding in understanding complex telecom concepts and specifications.