Elizezen/Himeyuri-v0.1-12B
Text generation · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Aug 26, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
Elizezen/Himeyuri-v0.1-12B is an experimental 12-billion-parameter model built on Mistral-Nemo-Instruct-2407. It stands out for strong Japanese-language performance, particularly in creative text generation, and is designed primarily for novel writing: producing extended narrative continuations from a prose prompt. It is not optimized for role-play or instruction-following responses.
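A minimal sketch of loading the model locally, assuming the repo exposes standard AutoModelForCausalLM weights as Mistral-Nemo-Instruct-2407 derivatives typically do; verify against the repo's own usage notes. The Japanese prompt is an illustrative example, not from the model card.

```python
# Sketch: load Elizezen/Himeyuri-v0.1-12B with Hugging Face transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Elizezen/Himeyuri-v0.1-12B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # full-precision 12B weights; quantize if VRAM is tight
    device_map="auto",
)

# The model targets raw narrative continuation, not chat, so prompt
# with prose rather than with instructions.
prompt = "夜の街は静かだった。"  # "The night streets were quiet."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```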
Popular Sampler Settings
The three parameter combinations most used by Featherless users for this model.
temperature: (value not captured)
top_p: (value not captured)
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: (value not captured)
min_p: –
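A minimal sketch of applying sampler settings like these through an OpenAI-compatible completions call. The base URL, the placeholder sampler values, and the prompt are assumptions, not the community settings above (which were not preserved); check the Featherless API docs for the current endpoint and supported fields, since extensions such as repetition_penalty and min_p are not part of the core OpenAI schema.

```python
# Sketch: pass sampler settings to an OpenAI-compatible endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed endpoint; confirm in the docs
    api_key="YOUR_API_KEY",
)

response = client.completions.create(
    model="Elizezen/Himeyuri-v0.1-12B",
    prompt="雨が降り始めた。",  # "It started to rain." (illustrative)
    max_tokens=256,
    # Illustrative values only; substitute a config from the site.
    temperature=0.8,
    top_p=0.95,
    extra_body={"repetition_penalty": 1.05},  # non-OpenAI field, if the server supports it
)
print(response.choices[0].text)
```

Note that a dash in the table above means the parameter is left unset, in which case the server's defaults apply.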