farbodtavakkoli/OTel-LLM-3B-IT

Hugging Face
Text Generation · Concurrency Cost: 1 · Model Size: 3B · Quantization: BF16 · Context Length: 32k · Published: Feb 12, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

OTel-LLM-3B-IT by farbodtavakkoli is a 3 billion parameter instruction-tuned language model specialized for the telecommunications domain, built upon Mistral-3-3B. Fine-tuned on extensive telecom-focused data curated by over 100 domain experts, it excels at generating accurate responses grounded in telecommunications standards, specifications, and documentation. This model is primarily designed for Retrieval-Augmented Generation (RAG) pipelines within the telecom sector, featuring abstention training to prevent hallucinations when context is insufficient.


OTel-LLM-3B-IT: A Specialized Telecom Language Model

OTel-LLM-3B-IT, developed by farbodtavakkoli, is a 3 billion parameter instruction-tuned language model specifically designed for the telecommunications industry. Built on the mistralai/Mistral-3-3B base model, it has undergone full parameter fine-tuning using a unique dataset curated by over 100 domain experts from institutions like Yale University, GSMA, and The University of Texas at Dallas. This model is part of the broader OTel Family of Models, an open-source initiative aimed at establishing industry-standard AI for global telecom.

Key Capabilities

  • Telecom Domain Expertise: Fine-tuned on a comprehensive dataset including arXiv telecom papers, 3GPP standards, GSMA documents, IETF RFCs, and O-RAN specifications.
  • RAG Optimization: Designed to power end-to-end Retrieval-Augmented Generation (RAG) pipelines for telecommunications, working in conjunction with OTel Embedding and Reranker models.
  • Hallucination Prevention: Incorporates abstention training, enabling the model to decline answering when insufficient context is provided, ensuring grounded and reliable outputs.
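To make the abstention-aware RAG setup concrete, here is a minimal sketch of how retrieved passages might be packed into a prompt with an explicit abstention instruction. The function name, prompt wording, and document formatting below are illustrative assumptions, not the model's official prompt template.

```python
# Hypothetical sketch: assembling a RAG prompt with an abstention clause.
# The system text and [Doc N] formatting are illustrative only.

def build_rag_prompt(question: str, passages: list[str]) -> str:
    """Join retrieved passages into a numbered context block and instruct
    the model to abstain when the context cannot answer the question."""
    context = "\n\n".join(
        f"[Doc {i + 1}] {p}" for i, p in enumerate(passages)
    )
    return (
        "You are a telecom assistant. Answer ONLY from the context below. "
        "If the context is insufficient, reply exactly: "
        "\"I cannot answer based on the provided context.\"\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

prompt = build_rag_prompt(
    "What does 3GPP TS 38.104 cover?",
    ["3GPP TS 38.104 specifies NR base station radio transmission and reception."],
)
print(prompt)
```

A prompt built this way pairs naturally with the model's abstention training: when retrieval returns nothing relevant, the instruction gives the model a sanctioned way to decline rather than hallucinate.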

Good for

  • Generating accurate responses to queries based on telecom standards and documentation.
  • Developing specialized RAG applications within the telecommunications sector.
  • Context-grounded language generation for researchers and developers working with large volumes of technical telecom data.
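The retrieval stage that feeds such a pipeline can be sketched as a top-k nearest-neighbor search over document embeddings. The vectors below are made up for illustration; in practice they would come from an embedding model (the card mentions a companion OTel Embedding model, whose API is not shown here).

```python
import numpy as np

# Illustrative retrieval stage for a telecom RAG pipeline.
# The 2-D vectors are toy stand-ins for real embedding vectors.

def top_k(query_vec: np.ndarray, doc_vecs: np.ndarray, k: int = 2) -> np.ndarray:
    """Return indices of the k documents most similar to the query,
    ranked by cosine similarity (highest first)."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q                      # cosine similarity per document
    return np.argsort(scores)[::-1][:k]  # best-scoring indices first

docs = np.array([[1.0, 0.0], [0.7, 0.7], [0.0, 1.0]])
query = np.array([1.0, 0.1])
print(top_k(query, docs))
```

The top-ranked passages would then be re-ordered by a reranker and inserted into the generation prompt, so the language model only ever sees the most relevant context.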