klyang/MentaLLaMA-chat-13B
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Context Length: 4K · Published: Sep 27, 2023 · License: MIT · Architecture: Transformer

MentaLLaMA-chat-13B, developed by klyang, is a 13 billion parameter instruction-following large language model fine-tuned from Meta's LLaMA2-chat-13B. It is specifically designed for interpretable mental health analysis, providing predictions and explanations for various mental health conditions. The model was trained on the IMHI dataset, comprising 75K high-quality natural language instructions, and is intended for non-clinical research applications.
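Since the model is fine-tuned from LLaMA2-chat-13B, prompts are typically wrapped in Meta's LLaMA-2 chat template before generation. A minimal sketch of that formatting is below; the helper name `build_prompt` and the example post are illustrative, not part of the model card, and exact serving details may differ.

```python
def build_prompt(instruction: str, system: str = "") -> str:
    """Wrap an instruction in the LLaMA-2 chat template
    ([INST] ... [/INST], with an optional <<SYS>> system block),
    which MentaLLaMA-chat-13B inherits from its LLaMA2-chat base."""
    if system:
        return f"[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{instruction} [/INST]"
    return f"[INST] {instruction} [/INST]"

# Hypothetical IMHI-style query: ask for a prediction plus an explanation.
prompt = build_prompt(
    "Consider this post: 'I feel exhausted all the time.' "
    "Question: Does the poster suffer from stress? "
    "Answer the question and explain your reasoning."
)
```

The resulting string can then be tokenized and passed to the model (e.g. via `transformers`' `AutoTokenizer`/`AutoModelForCausalLM` with the `klyang/MentaLLaMA-chat-13B` checkpoint) for non-clinical research use.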
