Hachipo/llama3-8B-Instruct_PIFT-enja_manywords_2000 is an 8-billion-parameter instruction-tuned language model based on the Llama 3 architecture. It is fine-tuned for English and Japanese, with an emphasis on producing diverse, vocabulary-rich outputs, and targets applications that require detailed, varied text generation in both languages. The model supports a context length of 8,192 tokens.