nonetrix/sillyrp-7b
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · Published: Feb 26, 2024 · License: apache-2.0 · Architecture: Transformer

nonetrix/sillyrp-7b is a 7-billion-parameter merged language model created with the task arithmetic method. It combines several Mistral-7B-based fine-tunes, including roleplay and DPO variants, to enhance conversational and creative text generation. The model is intended for experimental use in producing diverse, engaging responses, particularly in role-playing scenarios.
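Task arithmetic merges models by adding scaled "task vectors" (the difference between a fine-tuned checkpoint and its shared base) onto the base weights. The following is a minimal sketch of that idea; the parameter values and scaling coefficients are illustrative, not the actual recipe used for this model.

```python
def task_arithmetic_merge(base, finetuned_models, lambdas):
    """Merge checkpoints by adding scaled task vectors to the base weights.

    base:             dict mapping parameter name -> value
    finetuned_models: list of dicts with the same keys as base
    lambdas:          per-model scaling coefficients
    """
    merged = dict(base)
    for weights, lam in zip(finetuned_models, lambdas):
        for name, value in weights.items():
            # task vector = finetuned - base; scale it and accumulate
            merged[name] += lam * (value - base[name])
    return merged

# Scalars stand in for full weight tensors; values are hypothetical.
base = {"w": 1.0}
rp   = {"w": 1.4}   # roleplay fine-tune
dpo  = {"w": 0.8}   # DPO fine-tune
merged = task_arithmetic_merge(base, [rp, dpo], [0.5, 0.5])
print(round(merged["w"], 6))  # 1.0 + 0.5*0.4 + 0.5*(-0.2) = 1.1
```

In practice the same per-parameter update runs over every tensor in the checkpoints (for example via a merging toolkit such as mergekit), and the lambda coefficients control how strongly each fine-tune's behavior carries into the merge.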
