benchang1110/Taiwan-tinyllama-v1.1-base
Text Generation · Concurrency Cost: 1 · Model Size: 1.1B · Quant: BF16 · Context Length: 2k · Published: Jan 23, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights · Status: Warm

benchang1110/Taiwan-tinyllama-v1.1-base is a continually pretrained version of TinyLlama-v1.1, tailored for Traditional Chinese. This causal language model was trained on over 10 billion tokens of Traditional Chinese data, specializing it for text generation in that language. It requires approximately 3GB of VRAM for inference in bfloat16, making it an efficient option for Traditional Chinese language processing tasks.
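As a rough sketch of what local inference looks like, the snippet below loads the model in bfloat16 with the Hugging Face transformers library. The prompt text and generation length are illustrative examples, not part of the model card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "benchang1110/Taiwan-tinyllama-v1.1-base"

# Load the tokenizer and the model in bfloat16 (roughly 3GB of VRAM).
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Illustrative Traditional Chinese prompt; replace with your own text.
prompt = "台灣是一個"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Continue the prompt (this is a base model, so there is no chat template).
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```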


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model are built from the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
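Continuing from the loading snippet above, most of these sampler settings map directly onto Hugging Face transformers generation arguments. The values below are placeholders for illustration only; the actual user configurations are not reproduced here. Note that frequency_penalty and presence_penalty are OpenAI-style API parameters with no direct transformers equivalent; repetition_penalty is the closest analogue.

```python
# Placeholder sampler values; reuses `model`, `tokenizer`, and `inputs`
# from the loading example above.
outputs = model.generate(
    **inputs,
    do_sample=True,
    temperature=0.7,         # softmax temperature
    top_p=0.9,               # nucleus (top-p) sampling
    top_k=50,                # top-k filtering
    repetition_penalty=1.1,  # penalize repeated tokens
    min_p=0.05,              # min-p filtering (requires a recent transformers release)
    max_new_tokens=128,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```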