bitext/Mistral-7B-Telco
Text Generation · Model Size: 7B · Quantization: FP8 · Context Length: 4k · License: apache-2.0 · Architecture: Transformer · Concurrency Cost: 1 · Open Weights

bitext/Mistral-7B-Telco is a 7-billion-parameter language model developed by Bitext, fine-tuned from Mistral-7B-Instruct-v0.2 with a 4096-token context length. It is optimized for the telecommunications (Telco) domain, answering questions and assisting with Telco-related procedures. The model was trained on hybrid synthetic data generated with Bitext's Data Labeling tools, an approach intended to make it easy to build verticalized enterprise models for chatbots and virtual assistants.
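Because the model is fine-tuned from Mistral-7B-Instruct-v0.2, prompts presumably follow the standard Mistral instruction template (`<s>[INST] ... [/INST]`). A minimal sketch of building such a prompt (the helper name and example question are illustrative, not from the model card):

```python
def build_telco_prompt(question: str) -> str:
    """Wrap a user question in the Mistral-Instruct chat template
    ([INST] ... [/INST]) that Mistral-7B-Instruct-v0.2 derivatives expect."""
    return f"<s>[INST] {question.strip()} [/INST]"

# Example: a telecom-domain question; the full prompt must fit within
# the model's 4096-token context window.
prompt = build_telco_prompt("What does QoS Class Identifier 9 mean in LTE?")
print(prompt)
```

The resulting string would then be tokenized and passed to the model for generation, e.g. via the Hugging Face `transformers` library or the hosting provider's inference API.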
