farbodtavakkoli/OTel-LLM-8.3B-IT
Text generation · 8.3B parameters · FP8 quantization · 32k context length · Published: Feb 11, 2026 · License: apache-2.0 · Architecture: Transformer

OTel-LLM-8.3B-IT by farbodtavakkoli is an 8.3 billion parameter language model fine-tuned specifically for the telecommunications domain. Built on the EssentialAI/rnj-1-instruct base, it excels at question answering and RAG applications within telecom specifications and standards. This model was trained on high-quality, expert-curated data from sources like GSMA, 3GPP, and O-RAN documentation, making it highly specialized for the global telecommunications sector.


OTel-LLM-8.3B-IT: A Specialized Telecom Language Model

OTel-LLM-8.3B-IT is an 8.3 billion parameter language model developed by farbodtavakkoli, specifically fine-tuned for the telecommunications industry. It is part of the OTel Family of Models, an open-source initiative aimed at creating industry-standard AI for the global telecom sector. The model is based on the EssentialAI/rnj-1-instruct architecture and is licensed under Apache 2.0.

Key Capabilities & Training

This model underwent full-parameter fine-tuning on an extensive dataset curated by over 200 domain experts from leading organizations including AT&T, GSMA, and Purdue University. The training data includes critical telecom-focused documents such as:

  • GSMA Permanent Reference Documents
  • 3GPP Specifications
  • O-RAN Documentation
  • RFC Series
  • Industry whitepapers and academic papers covering eSIM, networks, security, and more.

Intended Use Cases

OTel-LLM-8.3B-IT is optimized for specific applications within the telecommunications domain:

  • RAG applications: Enhancing retrieval-augmented generation systems with telecom-specific knowledge.
  • Question answering: Providing accurate answers to queries based on telecom specifications and standards.

This specialization makes it a valuable tool for developers and researchers working with telecommunications data.
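To make the RAG use case concrete, here is a minimal sketch of how retrieved telecom passages might be assembled into a prompt for this model. The tiny keyword-overlap retriever and the corpus snippets are illustrative placeholders (not real spec text or the model card's own recipe); the final generation step is shown as a comment since it requires downloading the 8.3B checkpoint.

```python
# Minimal RAG sketch: rank telecom passages by keyword overlap with the
# query, then build a grounded prompt for OTel-LLM-8.3B-IT.
# Corpus snippets below are illustrative placeholders, not real spec text.

def score(query: str, passage: str) -> int:
    """Count distinct query words that also appear in the passage."""
    return len(set(query.lower().split()) & set(passage.lower().split()))

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the top-k passages ranked by word overlap with the query."""
    return sorted(corpus, key=lambda p: score(query, p), reverse=True)[:k]

def build_prompt(query: str, passages: list[str]) -> str:
    """Assemble retrieved passages and the question into one prompt."""
    context = "\n\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    )

corpus = [
    "3GPP TS 23.501 defines the 5G System architecture.",
    "GSMA SGP.22 specifies the eSIM remote provisioning architecture.",
    "O-RAN splits the RAN into O-CU, O-DU, and O-RU components.",
]

query = "Which document specifies eSIM remote provisioning?"
prompt = build_prompt(query, retrieve(query, corpus))
print(prompt)

# To generate an answer, pass `prompt` to the model, e.g. via transformers:
#   from transformers import pipeline
#   pipe = pipeline("text-generation",
#                   model="farbodtavakkoli/OTel-LLM-8.3B-IT")
#   print(pipe(prompt, max_new_tokens=200)[0]["generated_text"])
```

In a production RAG system the keyword scorer would be replaced by a vector store over embedded chunks of GSMA, 3GPP, and O-RAN documents, but the prompt-assembly pattern stays the same.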