hyunseoki/ko-en-llama2-13b
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Context Length: 4K · Published: Oct 2, 2023 · Architecture: Transformer

The hyunseoki/ko-en-llama2-13b model, developed by Hyunseok Lee and Taeyoung Kim, is a 13-billion-parameter auto-regressive language model built on the LLaMA2 transformer architecture with a 4096-token context length. It was trained on a combination of English and Korean datasets (open Wiki and AIHub data) to learn Korean while preserving LLaMA2's English proficiency, making it suited to applications that require strong text generation and understanding in both languages.
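Since the model is a standard LLaMA2 checkpoint, it can be loaded with the usual Hugging Face transformers workflow. The sketch below is illustrative rather than taken from the model card; the dtype, device placement, and prompt are assumptions.

```python
# Minimal sketch: loading hyunseoki/ko-en-llama2-13b with transformers.
# Standard LLaMA2 usage is assumed; dtype/device settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hyunseoki/ko-en-llama2-13b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # 13B weights need roughly 26 GB in fp16
    device_map="auto",
)

# Korean prompt; the model handles both Korean and English text.
prompt = "대한민국의 수도는"  # "The capital of South Korea is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```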


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model cover the following sampler settings; a sketch of passing them through the API follows the list.

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
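These settings map onto a standard completion request. The sketch below assumes Featherless's OpenAI-compatible endpoint at https://api.featherless.ai/v1; the sampler values are illustrative placeholders, not the actual top user configurations, and support for the non-OpenAI fields (top_k, repetition_penalty, min_p) via extra_body is an assumption about the backend.

```python
# Hedged sketch: sending sampler settings to an OpenAI-compatible endpoint.
# Base URL and extra-field support are assumptions; values are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed Featherless endpoint
    api_key="YOUR_FEATHERLESS_API_KEY",
)

response = client.completions.create(
    model="hyunseoki/ko-en-llama2-13b",
    prompt="한국의 전통 음식에 대해 설명해줘.\n",  # "Describe traditional Korean food."
    max_tokens=256,
    temperature=0.7,        # illustrative value, not a real user config
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # Samplers outside the OpenAI schema are passed as extra request fields,
    # assuming the backend accepts them.
    extra_body={"top_k": 40, "repetition_penalty": 1.1, "min_p": 0.05},
)
print(response.choices[0].text)
```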