UniLLMer/CasAuTabom24BcmlKaajtmentKaa12816
TEXT GENERATION · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

UniLLMer/CasAuTabom24BcmlKaajtmentKaa12816 is a 24-billion-parameter Mistral-based language model from UniLLMer, fine-tuned from Casual-Autopsy/The-True-Abomination-24B. Its training mix combines ShareGPT chat logs, Alpaca-formatted instructions, and material on psychology concepts. The model was fine-tuned using Unsloth and Hugging Face's TRL library.
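Models listed this way are typically served behind an OpenAI-compatible chat-completions endpoint. As a minimal sketch, the snippet below assembles a request payload for this model; the helper name, default parameter values, and the assumption of an OpenAI-compatible API are illustrative, not documented details of this deployment.

```python
import json

MODEL_ID = "UniLLMer/CasAuTabom24BcmlKaajtmentKaa12816"

def build_chat_request(messages, max_tokens=512, temperature=0.7):
    """Assemble an OpenAI-compatible chat-completion payload.

    `messages` is a list of {"role": ..., "content": ...} dicts.
    Parameter defaults here are arbitrary examples.
    """
    return {
        "model": MODEL_ID,
        "messages": messages,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

payload = build_chat_request([{"role": "user", "content": "Hello"}])
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the provider's chat-completions route with an API key; staying within the model's 32k-token context window is the caller's responsibility.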
