OTel-LLM-8.3B-Safety: Telecom-Specialized Language Model

OTel-LLM-8.3B-Safety is an 8.3 billion parameter language model developed by farbodtavakkoli and fine-tuned specifically for the telecommunications industry. Built on the EssentialAI/rnj-1-instruct base model, it is optimized for RAG applications and question answering over telecom specifications and standards. It is a key component of the OTel Family of Models, an open-source effort to create industry-standard AI for the global telecom sector.
Key Capabilities and Training
This model underwent full-parameter fine-tuning on high-quality, telecom-focused data. The specialized dataset was curated by more than 200 domain experts from leading organizations, including AT&T, GSMA, Purdue University, and Yale University. The training data spans a wide range of telecommunications documentation, such as:
- GSMA Permanent Reference Documents
- 3GPP Specifications
- O-RAN Documentation
- RFC Series
- Industry whitepapers and academic papers covering eSIM, terminals, security, networks, roaming, and APIs.
Intended Use Cases
OTel-LLM-8.3B-Safety is specifically optimized for applications within the telecommunications domain:
- RAG applications: Enhancing retrieval-augmented generation systems with telecom-specific knowledge.
- Question answering: Providing accurate answers based on complex telecom specifications and standards.
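To make the RAG use case concrete, here is a minimal sketch of how retrieved telecom references might be assembled into a grounded prompt for the model. The retriever (simple keyword overlap), the corpus snippets, and the prompt format are all illustrative assumptions for this sketch, not taken from the model's actual chat template; in practice you would substitute a real vector store and the model's own prompt conventions.

```python
def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank corpus passages by keyword overlap with the query.

    Stand-in for a real vector-store retriever; any retriever
    that returns a ranked list of passages works here.
    """
    q_terms = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda p: len(q_terms & set(p.lower().split())),
        reverse=True,
    )
    return scored[:k]


def build_prompt(query: str, passages: list[str]) -> str:
    """Assemble a grounded prompt from the retrieved passages."""
    context = "\n\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        "Answer the question using only the telecom references below.\n\n"
        f"{context}\n\n"
        f"Question: {query}\n"
        "Answer:"
    )


# Illustrative corpus of spec snippets (invented for this sketch).
corpus = [
    "3GPP TS 23.501 defines the 5G System architecture, including the AMF and SMF.",
    "GSMA SGP.22 specifies the eSIM remote provisioning architecture for consumer devices.",
    "RFC 791 defines the Internet Protocol (IPv4) header format.",
]

query = "Which document specifies eSIM remote provisioning?"
passages = retrieve(query, corpus)
prompt = build_prompt(query, passages)
# `prompt` would then be sent to OTel-LLM-8.3B-Safety for generation.
```

The point of the sketch is the division of labor: retrieval narrows the context to telecom-relevant passages, and the model's domain fine-tuning is what lets it answer accurately from that context.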
The model was trained using ScalarLM on compute resources from TensorWave (AMD GPUs) and Azure (NVIDIA GPUs).