orai-nlp/Llama-eus-8B
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Sep 4, 2024 · Architecture: Transformer

Llama-eus-8B is an 8-billion-parameter foundational large language model developed by Orai NLP Technologies, adapted from Meta's Llama 3.1. It is tailored to Basque through continual pretraining on 1.5 billion high-quality Basque tokens from the ZelaiHandi dataset, alongside a subset of FineWeb. The adaptation markedly improves formal and functional linguistic competence in Basque while largely retaining the base model's general English capabilities. With a 32,768-token context length, it is well suited to natural language understanding and instruction following in low-resource languages such as Basque.


Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model. Each configuration specifies values for: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
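These parameters are standard decoding controls rather than anything specific to this model. As a rough illustration (a minimal sketch, not Featherless's actual implementation, and omitting the frequency/presence/repetition penalties, which additionally depend on previously generated tokens), this shows how temperature, top_k, top_p, and min_p shape the next-token distribution:

```python
import math

def next_token_distribution(logits, temperature=1.0, top_k=0, top_p=1.0, min_p=0.0):
    """Return a filtered, renormalized {token_index: probability} map.

    temperature > 0 rescales logits (lower = sharper distribution);
    top_k keeps only the k most likely tokens (0 disables);
    top_p keeps the smallest set whose cumulative probability reaches top_p;
    min_p drops tokens below min_p * (probability of the most likely token).
    """
    # Temperature scaling followed by a numerically stable softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Consider tokens from most to least probable.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    if top_k > 0:
        order = order[:top_k]

    # min_p filter, relative to the top token's probability.
    threshold = min_p * probs[order[0]]
    order = [i for i in order if probs[i] >= threshold]

    # top-p (nucleus) filter: smallest prefix reaching the cumulative mass.
    kept, cumulative = [], 0.0
    for i in order:
        kept.append(i)
        cumulative += probs[i]
        if cumulative >= top_p:
            break

    # Renormalize over the surviving tokens.
    z = sum(probs[i] for i in kept)
    return {i: probs[i] / z for i in kept}
```

Sampling then draws a token from the returned distribution; with `top_k=1` (or a very low temperature) this degenerates to greedy decoding.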