klyang/MentaLLaMA-chat-7B-hf
Text generation · Model size: 7B · Architecture: Transformer · Quant: FP8 · Context length: 4K · Concurrency cost: 1 · Published: Jan 11, 2024 · License: MIT · Open weights

MentaLLaMA-chat-7B is a 7 billion parameter instruction-following large language model published by klyang, fine-tuned from Meta's LLaMA2-chat-7B. It is designed for interpretable mental health analysis: rather than only classifying posts, it produces predictions for various mental health conditions together with natural-language explanations for those predictions. The model was trained on the IMHI dataset of 75K instruction samples and achieves performance comparable to state-of-the-art discriminative methods on the IMHI benchmark.
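The card does not include a usage snippet; a minimal sketch with the Hugging Face `transformers` library might look like the following. The model id comes from the page header, and the `[INST] … [/INST]` prompt template is an assumption carried over from the LLaMA2-chat-7B base model; the example instruction is hypothetical.

```python
def build_prompt(instruction: str) -> str:
    """Wrap an instruction in the LLaMA-2 chat template.

    Assumption: MentaLLaMA-chat-7B keeps the template of its
    LLaMA2-chat-7B base model.
    """
    return f"[INST] {instruction.strip()} [/INST]"


def generate(instruction: str, max_new_tokens: int = 256) -> str:
    """Load the model and generate an analysis for one instruction."""
    # Heavy imports kept local so build_prompt stays dependency-free.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "klyang/MentaLLaMA-chat-7B-hf"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


# Hypothetical instruction in the IMHI style:
# generate("Consider this post: \"I haven't slept properly in weeks and "
#          "nothing feels worth doing.\" Does the poster show signs of "
#          "depression? Explain your reasoning.")
```

Note that a 7B model in FP8 still needs several gigabytes of GPU memory; `device_map="auto"` lets `transformers` place the weights on the available devices.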
