rhaymison/Mistral-portuguese-luana-7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8K · Published: Apr 1, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
rhaymison/Mistral-portuguese-luana-7b is a 7-billion-parameter language model fine-tuned from Mistral 7B. Developed by rhaymison, it was trained on a set of 200,000 Portuguese instructions to address the scarcity of Portuguese-centric LLMs. The model is optimized for instruction-following tasks in Portuguese, offering improved natural language understanding and generation in that language.
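For local inference, a minimal sketch using the Hugging Face transformers library is shown below. The prompt, dtype, and sampling values are illustrative assumptions, not settings documented for this model; the exact chat formatting comes from the tokenizer's own chat template.

```python
# Minimal local-inference sketch for rhaymison/Mistral-portuguese-luana-7b.
# Prompt and generation settings are placeholders for illustration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "rhaymison/Mistral-portuguese-luana-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Build the prompt with the model's own chat template (Mistral-style).
messages = [{"role": "user", "content": "Explique o que é aprendizado de máquina em poucas frases."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(
    inputs, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.9
)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```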
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model. Each configuration specifies values for the following sampler parameters: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
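To illustrate where these parameters fit in a request, the sketch below sends a chat completion with explicit sampler settings through an OpenAI-compatible client. The base URL, environment variable, and every parameter value here are assumptions for illustration, not one of the recorded user configurations; the actual Featherless endpoint and which extra sampler fields it accepts should be confirmed against its API documentation.

```python
# Hypothetical sketch: calling the model with explicit sampler settings
# via an OpenAI-compatible client. Endpoint and values are placeholders.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed endpoint; verify in the docs
    api_key=os.environ["FEATHERLESS_API_KEY"],  # assumed env var name
)

response = client.chat.completions.create(
    model="rhaymison/Mistral-portuguese-luana-7b",
    messages=[{"role": "user", "content": "Resuma a história do Brasil em três frases."}],
    # Standard OpenAI-style sampler parameters (illustrative values):
    temperature=0.7,
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # Non-standard samplers are commonly passed as extra body fields on
    # OpenAI-compatible servers; whether this server accepts them is an assumption.
    extra_body={"top_k": 40, "min_p": 0.05, "repetition_penalty": 1.1},
)

print(response.choices[0].message.content)
```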