farbodtavakkoli/OTel-LLM-4B-IT

Hugging Face
Vision · Concurrency Cost: 1 · Model Size: 4.3B · Quant: BF16 · Context Length: 32k · Published: Feb 11, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

OTel-LLM-4B-IT by farbodtavakkoli is a 4-billion-parameter instruction-tuned language model built on Google's Gemma-3-4b-it. Fine-tuned on telecommunications-domain data, it is designed to generate accurate, context-grounded responses within Retrieval-Augmented Generation (RAG) pipelines for the telecom sector, and is trained to abstain from answering when the provided context is insufficient, reducing hallucination.


OTel-LLM-4B-IT: A Specialized Telecom Language Model

OTel-LLM-4B-IT is a 4 billion parameter instruction-tuned language model developed by farbodtavakkoli, based on Google's Gemma-3-4b-it. It is a core component of the OTel Family of Models, an open-source initiative aimed at creating industry-standard AI for telecommunications.

Key Capabilities & Features

  • Domain-Specific Fine-tuning: The model underwent full parameter fine-tuning using extensive telecom-focused data, curated by over 100 domain experts from institutions like Yale University, GSMA, NetoAI, Khalifa University, and the University of Leeds. This dataset includes arXiv telecom papers, 3GPP standards, GSMA documents, IETF RFCs, industry whitepapers, and O-RAN specifications.
  • RAG Pipeline Integration: Designed to function within end-to-end Retrieval-Augmented Generation (RAG) pipelines for telecommunications. It complements OTel Embedding models (for retrieving relevant chunks) and OTel Reranker models (for prioritizing retrieved information).
  • Context-Grounded Generation: Features abstention training, meaning the model is optimized to decline to answer when it does not receive sufficient context, thereby minimizing hallucination. This makes it particularly suitable for applications requiring high factual accuracy grounded in provided documentation.
  • Open-Source Initiative: Part of a broader effort to provide open-source AI resources for the global telecommunications sector, including related datasets like OTel-LLM and OTel-Embedding.
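The retrieve → rerank → generate flow described above can be sketched as a minimal pipeline. The function names, signatures, and the abstention sentinel below are illustrative assumptions, not part of any documented OTel API; the real retriever, reranker, and generator would be backed by the OTel Embedding, OTel Reranker, and OTel-LLM-4B-IT models respectively:

```python
from typing import Callable, List

# Illustrative abstention message; the model's actual refusal wording will differ.
ABSTAIN = "I don't have enough context to answer that."

def rag_answer(
    query: str,
    retrieve: Callable[[str], List[str]],            # e.g. an OTel Embedding retriever
    rerank: Callable[[str, List[str]], List[str]],   # e.g. an OTel Reranker
    generate: Callable[[str], str],                  # e.g. OTel-LLM-4B-IT generation
    top_k: int = 3,
) -> str:
    """Retrieve candidate chunks, rerank them, and generate a grounded answer."""
    chunks = retrieve(query)
    if not chunks:
        # No context at all: abstain rather than risk hallucinating.
        return ABSTAIN
    best = rerank(query, chunks)[:top_k]
    context = "\n\n".join(best)
    prompt = (
        "Answer using only the context below. "
        "If the context is insufficient, say you cannot answer.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )
    return generate(prompt)
```

With real models, `generate` would wrap a call such as a `transformers` text-generation pipeline; here any callable that maps a prompt string to an answer string fits.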

Intended Use Cases

  • Telecommunications RAG Systems: Ideal for building AI systems that answer questions or generate content based on telecom specifications, standards, and documentation.
  • Technical Information Retrieval: Can be used to process and understand complex telecom-specific texts.
  • Knowledge Base Querying: Suitable for applications where accurate, non-hallucinated responses from a defined knowledge base are critical.
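A knowledge-base query of the kind described above might be framed like this. The message structure follows the common chat-messages convention used with instruction-tuned models; the system-prompt wording is an assumption for illustration, not the model's documented prompt:

```python
def build_grounded_messages(question: str, context_chunks: list) -> list:
    """Assemble chat messages that instruct the model to answer strictly from
    the supplied telecom documentation and to abstain when it is insufficient."""
    context = "\n---\n".join(context_chunks)
    system = (
        "You are a telecommunications assistant. Answer strictly from the "
        "provided context. If the context does not contain the answer, "
        "reply that you cannot answer."
    )
    user = f"Context:\n{context}\n\nQuestion: {question}"
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

# These messages would then typically be rendered through the model's chat
# template, e.g. tokenizer.apply_chat_template(messages, add_generation_prompt=True)
```

Keeping the context assembly separate from generation makes it easy to swap in chunks produced by the OTel Embedding and Reranker stages of the pipeline.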

Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model. Each configuration covers the following sampler parameters: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.