mpasila/Llama-3.1-Literotica-8B is an 8-billion-parameter Llama 3.1 model fine-tuned by mpasila to generate creative writing in the style of Literotica stories. It was trained for one epoch on a subset of the Literotica dataset, with documents chunked to fit an 8192-token context window. The model is optimized for narrative generation with a focus on adult thematic content.
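The fixed-window chunking mentioned above can be sketched as follows. This is an illustrative assumption, not the author's actual preprocessing pipeline: the real setup presumably uses the Llama tokenizer, whereas here whitespace-split tokens stand in so the idea stays self-contained.

```python
def chunk_tokens(tokens, max_len=8192):
    """Split a token sequence into consecutive chunks of at most max_len tokens."""
    return [tokens[i:i + max_len] for i in range(0, len(tokens), max_len)]

# Toy example: a 20,000-"token" document split for an 8192-token context window.
tokens = ("word " * 20000).split()
chunks = chunk_tokens(tokens, max_len=8192)
print(len(chunks))       # 3 chunks: 8192 + 8192 + 3616 tokens
print(len(chunks[0]))    # 8192
print(len(chunks[-1]))   # 3616
```

Each chunk then becomes an independent training sample, so no sequence exceeds the model's context window.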