theprint/Boptruth-Agatha-7B
Text Generation

- Concurrency Cost: 1
- Model Size: 7B
- Quant: FP8
- Ctx Length: 4k
- Published: Sep 11, 2024
- Architecture: Transformer

Boptruth-Agatha-7B is a 7-billion-parameter causal language model developed by theprint, fine-tuned from Boptruth-NeuralMonarch-7B on the MysteryWriter dataset. It is optimized for helping writers structure and plan crime, mystery, and thriller novels, offering specialized support for narrative development in those genres. The model has a context length of 4096 tokens.
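A minimal usage sketch, assuming the model is published on the Hugging Face Hub under the id shown on this page and loads with the standard `transformers` causal-LM classes (the prompt and the `generate_outline` helper below are illustrative, not part of the official card):

```python
MODEL_ID = "theprint/Boptruth-Agatha-7B"
MAX_CTX = 4096  # context length stated above


def generate_outline(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a continuation for a story-planning prompt.

    transformers is imported lazily so this sketch can be read and
    type-checked without the library (or the 7B weights) present.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    # Truncate to the model's 4096-token context window.
    inputs = tokenizer(
        prompt, return_tensors="pt", truncation=True, max_length=MAX_CTX
    )
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate_outline("Outline a locked-room mystery set on a night train."))
```

Note that a 7B model in FP8 still requires several gigabytes of memory; quantized or hosted inference may be preferable on consumer hardware.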
