Wothmag07/counseLLM
Text Generation

- Concurrency Cost: 1
- Model Size: 8B
- Quantization: FP8
- Context Length: 32K
- Published: Apr 4, 2026
- License: apache-2.0
- Architecture: Transformer (open weights)

Wothmag07/counseLLM is an 8-billion-parameter causal language model developed by Gowtham Arulmozhii, fine-tuned from Llama 3.1 8B Instruct with a 32K context length. It is aligned for empathetic conversational support through a two-stage pipeline of Supervised Fine-Tuning (SFT) followed by Direct Preference Optimization (DPO) on diverse counseling datasets. The model is intended to provide supportive, empathetic guidance for research and educational purposes in AI-assisted mental health support.
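A minimal usage sketch with Hugging Face `transformers`, assuming the model is published under the repo id `Wothmag07/counseLLM` and uses the standard Llama 3.1 chat template inherited from its base model. The system prompt below is illustrative, not the one used during fine-tuning.

```python
MODEL_ID = "Wothmag07/counseLLM"  # repo id from the model card


def build_messages(user_text: str) -> list[dict]:
    # Chat-format conversation. This system prompt is an illustrative
    # assumption; the card does not document the training-time prompt.
    return [
        {"role": "system",
         "content": "You are a supportive, empathetic counseling assistant."},
        {"role": "user", "content": user_text},
    ]


def generate_reply(user_text: str, max_new_tokens: int = 256) -> str:
    # Imported here so the prompt-building helper works without
    # transformers installed; first call downloads the 8B weights
    # and realistically needs a GPU.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer.apply_chat_template(
        build_messages(user_text),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:],
                            skip_special_tokens=True)
```

As with any model in this domain, outputs are for research and educational use, not a substitute for professional mental health care.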
