farbodtavakkoli/OTel-LLM-14B-IT

Text generation · 14B parameters · FP8 quantization · 32K context length · Published: Feb 11, 2026 · License: Apache 2.0 · Architecture: Transformer · Open weights

OTel-LLM-14B-IT by farbodtavakkoli is a 14 billion parameter, instruction-tuned language model based on Qwen3-14B, specifically fine-tuned on extensive telecommunications domain data. This model is designed to excel in telecom-specific applications, offering specialized knowledge for RAG systems and question answering on industry standards. It leverages a 32K token context length to process complex telecom specifications and documentation.


OTel-LLM-14B-IT: A Specialized Telecom Language Model

OTel-LLM-14B-IT, developed by farbodtavakkoli, is a 14 billion parameter language model built on the Qwen3-14B architecture. It is a key component of the OTel Family of Models, an initiative focused on creating open-source AI for the global telecommunications sector. The model underwent full-parameter fine-tuning on a comprehensive dataset curated by over 200 domain experts from leading organizations including AT&T, GSMA, and Purdue University.

Key Capabilities

  • Telecom Domain Expertise: Specialized knowledge derived from extensive training on GSMA Permanent Reference Documents, 3GPP Specifications, O-RAN Documentation, RFC Series, and various industry whitepapers.
  • Optimized for RAG: Designed to enhance Retrieval Augmented Generation (RAG) applications within the telecommunications industry.
  • Precise Question Answering: Excels at answering questions related to complex telecom specifications and standards.
  • Robust Training: Trained with the ScalarLM framework using compute resources from TensorWave (AMD GPUs) and Azure (NVIDIA GPUs).

Good For

  • Developing RAG systems that require deep understanding of telecommunications data.
  • Automating information retrieval from telecom specifications and academic papers.
  • Building intelligent assistants for telecom professionals and researchers.
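As a sketch of the first use case, the retrieval and prompt-assembly steps of a RAG pipeline around this model might look like the following. The keyword-overlap retriever, the document snippets, and the helper names are illustrative placeholders, not part of the OTel release; a production pipeline would instead retrieve with the companion OTel Embedding model and rerank with the OTel Reranker before sending the assembled prompt to OTel-LLM-14B-IT.

```python
# Minimal RAG sketch: retrieve the most relevant telecom snippets for a
# question, then assemble a grounded prompt for OTel-LLM-14B-IT.
# The word-overlap retriever below is a toy stand-in for a real
# embedding-based retriever.

def retrieve(question: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank corpus snippets by word overlap with the question."""
    q_words = set(question.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question: str, context: list[str]) -> str:
    """Pack retrieved context and the question into a single user turn."""
    ctx = "\n\n".join(f"[{i + 1}] {c}" for i, c in enumerate(context))
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{ctx}\n\n"
        f"Question: {question}"
    )

# Illustrative snippets of the kind of sources the model was trained on.
corpus = [
    "3GPP TS 23.501 defines the 5G System architecture, including the AMF and SMF.",
    "GSMA PRD IR.92 profiles IMS for voice over LTE.",
    "RFC 3261 specifies the Session Initiation Protocol (SIP).",
]
question = "Which 3GPP specification defines the 5G System architecture?"
prompt = build_prompt(question, retrieve(question, corpus))
```

The resulting `prompt` would then be passed to the model (for example via the Hugging Face `transformers` generation API), with the 32K context window leaving ample room for longer specification excerpts.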

This model is licensed under Apache 2.0 and is part of a broader ecosystem including OTel Embedding and OTel Reranker models, along with related datasets.