IniNLP247/Kenko-mental-health-llama-3-model

Hosted on Hugging Face
Text Generation · Model Size: 3.2B · Quant: BF16 · Context Length: 32k · License: apache-2.0 · Architecture: Transformer · Open Weights

IniNLP247/Kenko-mental-health-llama-3-model is a 3.2 billion parameter language model based on Llama 3.2 and fine-tuned by Inigo Chandia. It is designed to provide emotional support in the manner of a virtual therapist, addressing mental health concerns with consistent, empathetic responses.


Model Overview

IniNLP247/Kenko-mental-health-llama-3-model is a specialized 3.2 billion parameter language model developed by Inigo Chandia. Fine-tuned from the Llama 3.2-3B-Instruct base model, its primary purpose is to offer emotional support, akin to a therapist. The fine-tuning run took approximately 15 hours over 1.5 epochs on a dataset of 950,000 rows, using RunPod for compute and Axolotl as the training framework.

Key Capabilities

  • Emotional Support: Designed to provide consistent and empathetic responses for mental health assistance.
  • Therapeutic Interaction: Aims to mimic the support structure of a therapist, addressing user distress.
  • Robust Performance: Passed a series of 11 progressively more distressing test questions, demonstrating its ability to handle sensitive mental health queries.

Intended Use Cases

  • Mental Health Support: Ideal for applications requiring an AI companion for emotional well-being.
  • Crisis Intervention (Non-Emergency): Can serve as an initial point of contact for individuals seeking support until professional help is available.
  • Consistent Assistance: Provides reliable and continuous support for users dealing with mental health challenges.
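For the use cases above, the model can be loaded with the Hugging Face `transformers` library. The sketch below is illustrative, not the author's documented usage: it assumes the tokenizer ships the standard Llama 3.2 chat template, and the system prompt and generation settings are assumptions.

```python
# Minimal usage sketch (assumes `pip install transformers torch` and
# enough memory for a 3.2B BF16 model; the system prompt is an assumption).

def build_messages(system_prompt: str, user_message: str) -> list[dict]:
    """Assemble a chat-format message list for the text-generation pipeline."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_message},
    ]


def main() -> None:
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="IniNLP247/Kenko-mental-health-llama-3-model",
        torch_dtype="bfloat16",  # matches the BF16 precision listed above
    )
    messages = build_messages(
        "You are Kenko, a supportive mental-health companion.",  # assumed prompt
        "I've been feeling overwhelmed lately.",
    )
    out = generator(messages, max_new_tokens=256)
    # The pipeline returns the full conversation; print the model's reply.
    print(out[0]["generated_text"][-1]["content"])


if __name__ == "__main__":
    main()
```

The model download and generation only run when the script is executed directly, so the message-building helper can be reused without pulling the weights.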

Training Details

The model was fine-tuned on the usham/mental-health-companion-new dataset. The base model, Llama 3.2-3B-Instruct, was released on September 25, 2024, and features Grouped-Query Attention (GQA) for enhanced inference scalability. Fine-tuning was completed on September 9, 2025.
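Given the details above, an Axolotl configuration for this run might have looked roughly like the following. Only the base model, dataset, epoch count, and BF16 precision come from the card; every other key and value is an illustrative assumption, not the author's actual config:

```yaml
base_model: meta-llama/Llama-3.2-3B-Instruct

datasets:
  - path: usham/mental-health-companion-new
    type: chat_template        # assumption: conversational SFT format

num_epochs: 1.5                # card reports ~1.5 epochs over ~15 hours
bf16: true                     # matches the BF16 precision listed above

sequence_len: 4096             # assumption
micro_batch_size: 2            # assumption
gradient_accumulation_steps: 4 # assumption
learning_rate: 0.00002         # assumption
optimizer: adamw_torch         # assumption
lr_scheduler: cosine           # assumption

output_dir: ./kenko-mental-health-llama-3
```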