hon9kon9ize/CantoneseLLMChat-v1.0-7B
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quant: FP8 · Context length: 32k · Published: Oct 2, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
hon9kon9ize/CantoneseLLMChat-v1.0-7B is a 7.6-billion-parameter instruction-tuned causal language model built on Qwen 2.5 7B. Developed by hon9kon9ize, it is optimized for Cantonese conversation and Hong Kong-related knowledge, making it well suited to applications that require deep regional and cultural context.
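As a sketch of how a chat turn would be formatted for this model: Qwen-family instruction models conventionally use the ChatML template, and in practice you would let `tokenizer.apply_chat_template` render it for you. The manual rendering below is illustrative only; the helper name and the sample messages are assumptions, and the authoritative template ships with the model's tokenizer.

```python
def build_chatml_prompt(messages):
    """Render a list of {"role", "content"} dicts into a Qwen-style
    ChatML prompt, ending with an open assistant turn for generation."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    # Leave the assistant turn open so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

messages = [
    {"role": "system", "content": "你係一個樂於助人嘅廣東話助手。"},
    {"role": "user", "content": "你好嗎？"},
]
prompt = build_chatml_prompt(messages)
```

The resulting string would then be tokenized and passed to the model (e.g. via `transformers.AutoModelForCausalLM.generate`); loading the 7.6B weights is omitted here.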
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
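The user statistics above did not populate, so as a hedged illustration only: these sampler parameters are typically passed alongside the prompt in an OpenAI-compatible chat-completions request. The values below are hypothetical placeholders chosen for a conversational model, not the Featherless community settings.

```python
import json

# Hypothetical sampler configuration for an OpenAI-compatible endpoint.
# Every numeric value here is an illustrative assumption, not a recommendation.
payload = {
    "model": "hon9kon9ize/CantoneseLLMChat-v1.0-7B",
    "messages": [{"role": "user", "content": "講個笑話嚟聽下。"}],
    "temperature": 0.7,        # randomness of sampling
    "top_p": 0.9,              # nucleus sampling cutoff
    "top_k": 40,               # restrict to the 40 most likely tokens
    "repetition_penalty": 1.05,  # mildly discourage repeated tokens
}

body = json.dumps(payload, ensure_ascii=False)
```

`body` would be POSTed to the provider's chat-completions URL with an API key; the exact endpoint and supported parameter names should be checked against the provider's API documentation.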