Local-Novel-LLM-project/WabiSabi-V1
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Sep 30, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

WabiSabi-V1 is a 7-billion-parameter large language model developed by Local-Novel-LLM-project, fine-tuned from Mistral-7B-v0.1. It features an expanded 128k context window, significantly larger than that of its base model, and generates high-quality text in both Japanese and English. The model is notable for maintaining coherence across long-context generations and can produce NSFW content, making it suitable for creative and specialized applications that require extended conversational coherence.
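Since the card describes a Mistral-derived causal LM with open weights, a minimal usage sketch with the Hugging Face `transformers` library might look like the following. The repo id matches the card; the plain instruction/response prompt format is an assumption, as the card does not document a chat template, and dtype/device settings should be adjusted to your hardware.

```python
# Hypothetical usage sketch for WabiSabi-V1 via Hugging Face transformers.
# ASSUMPTIONS: weights are hosted under this repo id; the simple
# instruction/response prompt format below is illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Local-Novel-LLM-project/WabiSabi-V1"


def build_prompt(instruction: str) -> str:
    # Plain-text prompt; no official chat template is documented on the card.
    return f"### Instruction:\n{instruction}\n\n### Response:\n"


def generate(instruction: str, max_new_tokens: int = 256) -> str:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,
    )
    # Decode only the newly generated tokens, not the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    # Japanese prompt: "Please write a short story about cherry blossoms."
    print(generate("桜についての短い物語を書いてください。"))
```

Note that the card lists a 4k serving context length even though the model itself supports up to 128k tokens; long-context use may require adjusting the serving configuration.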
