shisa-ai/shisa-v2-llama3.1-8b
Text generation | Concurrency cost: 1 | Model size: 8B | Quant: FP8 | Context length: 32k | Published: Apr 12, 2025 | License: llama3.1 | Architecture: Transformer

shisa-ai/shisa-v2-llama3.1-8b is an 8-billion-parameter, Llama 3.1-based bilingual Japanese/English (JA/EN) general-purpose chat model developed by Shisa.AI. It has a 32,768-token context length and is optimized for strong performance on Japanese-language tasks while retaining solid English capabilities. On a range of Japanese benchmarks it outperforms both its base model and other models of similar size.


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model cover the following sampler settings:

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
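As a rough illustration of how these sampler settings are applied in practice, the sketch below builds an OpenAI-style chat completion payload for this model. The endpoint, the exact parameter names a given provider accepts (e.g. whether `repetition_penalty` and `min_p` are passed at the top level), and all numeric values here are assumptions for illustration, not the community presets from this page.

```python
# Minimal sketch: an OpenAI-compatible chat completion request body
# carrying the sampler parameters listed above. Values are illustrative
# defaults, not measured presets; parameter support varies by provider.
payload = {
    "model": "shisa-ai/shisa-v2-llama3.1-8b",
    "messages": [
        # "Please introduce yourself." -- a simple Japanese prompt
        {"role": "user", "content": "自己紹介をしてください。"}
    ],
    "temperature": 0.7,          # randomness of sampling
    "top_p": 0.9,                # nucleus sampling cutoff
    "top_k": 40,                 # restrict to the k most likely tokens
    "frequency_penalty": 0.0,    # penalize tokens by occurrence count
    "presence_penalty": 0.0,     # penalize tokens that appeared at all
    "repetition_penalty": 1.05,  # multiplicative repetition damping
    "min_p": 0.05,               # drop tokens below this relative prob.
    "max_tokens": 512,           # cap on generated tokens
}
```

Such a payload would typically be sent as JSON via POST to the provider's `/v1/chat/completions` endpoint with an API key in the `Authorization` header.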