SchubergPhilis/TinyLlama-1.1B-Chat-v0.6-ENG
Text Generation · Concurrency Cost: 1 · Model Size: 1.1B · Quant: BF16 · Ctx Length: 2k · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm

SchubergPhilis/TinyLlama-1.1B-Chat-v0.6-ENG is a 1.1 billion parameter language model developed by Schuberg Philis and Anoosh Ahmadi. Based on the TinyLlama architecture, this model is specifically fine-tuned on English conversations from the OpenAssistant-Top1-ENG-V1 dataset. It is optimized for chat-based applications and generating human-like responses in English.


Model Overview

The model is built upon the TinyLlama-1.1B-intermediate-step-955k-token-2T base model, originally created by jzhang38, and fine-tuned from that checkpoint specifically for chat applications.

Key Characteristics

  • Base Model: Utilizes the TinyLlama-1.1B-intermediate-step-955k-token-2T architecture.
  • Fine-tuning: Trained exclusively on English conversations from the SchubergPhilis/OpenAssistant-Top1-ENG-V1 dataset.
  • Format: Provided as SafeTensors model files.
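
The SafeTensors weights can be loaded with the Hugging Face `transformers` library. The sketch below assumes the standard `AutoTokenizer`/`AutoModelForCausalLM` API and that the repository ships a chat template; neither is confirmed by this card, so treat it as a starting point rather than the official usage example.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "SchubergPhilis/TinyLlama-1.1B-Chat-v0.6-ENG"


def generate_reply(user_message: str, max_new_tokens: int = 128) -> str:
    """Load the model and generate a single chat reply.

    Downloads roughly 2.2 GB of weights on first call.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    # Assumption: the repo provides a chat template for formatting messages.
    messages = [{"role": "user", "content": user_message}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Strip the prompt tokens so only the new reply is decoded.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Because the model was fine-tuned only on English conversations, prompts in other languages are unlikely to produce useful replies.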

Intended Use Cases

This model is particularly well-suited for:

  • English Chatbots: Generating conversational responses in English.
  • Dialogue Systems: Powering basic dialogue features where a compact model is preferred.
  • Resource-Constrained Environments: Its small size (1.1B parameters) makes it efficient for deployment in environments with limited computational resources.
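
The "small size" claim can be made concrete with a back-of-the-envelope estimate: at BF16 precision (2 bytes per parameter), the 1.1B weights alone occupy roughly 2.2 GB, before accounting for activations or the KV cache. A quick sketch (the parameter count is approximate):

```python
def weight_memory_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Approximate memory for model weights alone.

    Excludes activations, KV cache, and framework overhead.
    """
    return n_params * bytes_per_param / 1e9


# 1.1B parameters in BF16 (2 bytes each) -> about 2.2 GB
print(round(weight_memory_gb(1.1e9), 1))  # -> 2.2
```

This is why the model fits comfortably on consumer GPUs, and why quantizing below BF16 would shrink the footprint further at some quality cost.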