sail/Sailor-7B-Chat
Text Generation · Concurrency Cost: 1 · Model Size: 7.7B · Quant: FP8 · Ctx Length: 32k · Published: Mar 2, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
Sailor-7B-Chat is a 7.7 billion parameter instruction-tuned causal language model developed by sail, built upon the Qwen 1.5 architecture. It is specifically tailored for South-East Asian languages, including Indonesian, Thai, Vietnamese, Malay, and Lao, while maintaining proficiency in English and Chinese. The model excels at tasks such as question answering and commonsense reasoning across these diverse linguistic landscapes, with a context length of 32768 tokens.
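Since Sailor-7B-Chat is built on Qwen 1.5, it is reasonable to assume it uses the ChatML conversation format (`<|im_start|>`/`<|im_end|>` markers); in practice the tokenizer's own `apply_chat_template` should be preferred. A minimal sketch of building such a prompt by hand, under that assumption:

```python
def build_chatml_prompt(messages, add_generation_prompt=True):
    """Assemble a ChatML-style prompt string from a list of
    {"role": ..., "content": ...} messages. Assumption: Sailor-7B-Chat,
    as a Qwen 1.5 derivative, follows the ChatML format."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    if add_generation_prompt:
        # Leave the assistant turn open so the model completes it.
        parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    # Indonesian: "What is the capital of Indonesia?"
    {"role": "user", "content": "Apa ibu kota Indonesia?"},
])
print(prompt)
```

The resulting string ends with an open assistant turn, which is where generation resumes when the prompt is sent to the model.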
Popular Sampler Settings
The three most common parameter combinations used by Featherless users for this model.
temperature: – | top_p: – | top_k: – | frequency_penalty: – | presence_penalty: – | repetition_penalty: – | min_p: –
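These sampler parameters are typically supplied per request. A hedged sketch of how they might appear in an OpenAI-compatible chat-completions payload; the numeric values below are illustrative placeholders, not the statistics from the table above, and non-standard fields such as `top_k`, `repetition_penalty`, and `min_p` are server extensions that not every endpoint accepts:

```python
# Illustrative request body for an OpenAI-compatible endpoint.
# All sampler values are placeholder examples, not measured user settings.
payload = {
    "model": "sail/Sailor-7B-Chat",
    "messages": [{"role": "user", "content": "Hello!"}],
    "temperature": 0.7,         # placeholder value
    "top_p": 0.9,               # placeholder value
    "frequency_penalty": 0.0,   # placeholder value
    "presence_penalty": 0.0,    # placeholder value
    # Extension fields; support varies by server:
    "top_k": 40,                # placeholder value
    "repetition_penalty": 1.05, # placeholder value
    "min_p": 0.05,              # placeholder value
}
print(sorted(payload))
```

Lower temperatures bias toward deterministic answers (useful for QA), while higher values increase diversity; the penalties discourage repeated tokens over long generations.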