farbodtavakkoli/OTel-LLM-14B-IT

Text Generation · Concurrency Cost: 1 · Model Size: 14B · Quant: FP8 · Context Length: 32k · Published: Feb 11, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm

OTel-LLM-14B-IT by farbodtavakkoli is a 14 billion parameter instruction-tuned language model based on Qwen3-14B, specifically fine-tuned on extensive telecommunications domain data. This model is designed to generate accurate, context-grounded responses for telecom-specific queries, excelling in Retrieval-Augmented Generation (RAG) pipelines. Its primary strength lies in providing factual answers within the telecommunications sector, with built-in abstention training to prevent hallucination when context is insufficient.


OTel-LLM-14B-IT: A Specialized Telecom Language Model

OTel-LLM-14B-IT is a 14 billion parameter instruction-tuned language model developed by farbodtavakkoli, built upon the Qwen3-14B base architecture. It is a core component of the OTel Family of Models, an open-source initiative focused on creating AI solutions for the global telecommunications industry.

Key Capabilities and Features

  • Telecom Specialization: Fine-tuned using full parameter training on a vast dataset curated by over 100 domain experts from institutions like Yale University, GSMA, IETF, Khalifa University, University of Leeds, and The University of Texas at Dallas. This data includes arXiv telecom papers, 3GPP standards, GSMA documents, IETF RFCs, industry whitepapers, and O-RAN specifications.
  • RAG Pipeline Integration: Designed to function as the generative component in end-to-end Retrieval-Augmented Generation (RAG) pipelines for telecommunications. It works in conjunction with OTel Embedding and Reranker models to retrieve, prioritize, and then generate responses grounded in relevant telecom documentation.
  • Abstention Training: Incorporates abstention training, enabling the model to decline answering if it does not receive sufficient context, thereby minimizing hallucination and ensuring context-grounded generation.
  • Open-Source Initiative: Part of a broader effort to provide open-source AI models and datasets for the telecom sector, including related OTel-LLM, OTel-Embedding, and OTel-Reranker datasets.
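To illustrate the context-grounded generation described above, here is a minimal sketch of how retrieved telecom passages might be assembled into a prompt that also invites the model to abstain. The `build_rag_prompt` helper and the prompt wording are illustrative assumptions, not the model's documented prompt format.

```python
# Hypothetical prompt assembly for the generation step of a telecom
# RAG pipeline. Retrieved chunks would normally come from the OTel
# Embedding and Reranker stages.

def build_rag_prompt(question: str, retrieved_chunks: list[str]) -> str:
    """Join retrieved passages into a grounded prompt with an
    explicit abstention instruction."""
    context = "\n\n".join(
        f"[{i + 1}] {chunk}" for i, chunk in enumerate(retrieved_chunks)
    )
    return (
        "Answer the question using only the context below. "
        "If the context does not contain the answer, say you cannot answer.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )

prompt = build_rag_prompt(
    "Which RFC defines SIP?",
    ["RFC 3261 specifies the Session Initiation Protocol (SIP)."],
)
print(prompt)
```

Numbering the chunks (`[1]`, `[2]`, …) makes it easy to ask the model to cite which passage supports its answer.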

Intended Use Cases

This model is ideal for applications requiring precise, factual information within the telecommunications domain. It is particularly suited for:

  • Answering questions based on telecom specifications, standards, and documentation.
  • Building intelligent assistants for telecom professionals.
  • Developing knowledge retrieval systems for the telecommunications industry.

It is optimized for scenarios where responses must be strictly grounded in provided context, rather than open-ended creative generation.
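As a sketch of such a grounded-generation scenario, the snippet below builds a chat-completions request payload for the model. The endpoint shape assumes an OpenAI-compatible serving API; the system-prompt wording and the low temperature value are illustrative choices, not documented defaults.

```python
# Hypothetical request payload for querying OTel-LLM-14B-IT through
# an OpenAI-compatible chat-completions endpoint.
import json

def make_chat_request(question: str, context: str) -> dict:
    """Build a chat payload that keeps the model grounded in the
    supplied telecom context and allows abstention."""
    return {
        "model": "farbodtavakkoli/OTel-LLM-14B-IT",
        "messages": [
            {
                "role": "system",
                "content": (
                    "Answer only from the provided context; "
                    "abstain if the context is insufficient."
                ),
            },
            {
                "role": "user",
                "content": f"Context:\n{context}\n\nQuestion: {question}",
            },
        ],
        # Low temperature suits factual, specification-grounded answers.
        "temperature": 0.2,
    }

payload = make_chat_request(
    "What channel bandwidths does FR2 support?",
    "3GPP TS 38.101-2 defines FR2 channel bandwidths up to 400 MHz.",
)
print(json.dumps(payload, indent=2))
```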

Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model tune the following sampler settings: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
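For reference, a sampler configuration covering these parameters might look like the following. The values are placeholders chosen for illustration, not the measured Featherless user settings.

```python
# Illustrative sampler configuration; substitute values to taste.
sampler_config = {
    "temperature": 0.7,         # randomness of token selection
    "top_p": 0.9,               # nucleus sampling cutoff
    "top_k": 40,                # restrict to the 40 most likely tokens
    "frequency_penalty": 0.0,   # penalize tokens by how often they appear
    "presence_penalty": 0.0,    # penalize tokens that appeared at all
    "repetition_penalty": 1.05, # multiplicative penalty on repeats
    "min_p": 0.05,              # drop tokens below 5% of the top probability
}
print(sampler_config)
```

For context-grounded factual answering, lower temperature and tighter top_p are generally preferred over the more permissive settings used for creative generation.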