UNIVA-Bllossom/DeepSeek-llama3.1-Bllossom-8B
TEXT GENERATION · Open Weights · Warm
Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32K · Published: Feb 13, 2025 · License: MIT · Architecture: Transformer
The DeepSeek-llama3.1-Bllossom-8B model, developed by UNIVA and Bllossom, is an 8-billion-parameter language model built on the DeepSeek-R1-Distill-Llama-8B base. It is optimized to improve reasoning performance in Korean, addressing a limitation of its base model, which was trained primarily on English and Chinese data. The model achieves stronger Korean inference by performing its internal reasoning in English and then generating the response in the language of the user's input.
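For local experimentation, a minimal sketch of loading the model with Hugging Face transformers is shown below. It assumes the weights are published on the Hub under the model ID above and that the repository ships a chat template; neither detail is confirmed on this page, and the dtype choice is an illustrative default.

```python
# Minimal local-inference sketch with Hugging Face transformers.
# Assumptions: the model is on the Hub under this ID and provides
# a chat template (not confirmed by this page).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "UNIVA-Bllossom/DeepSeek-llama3.1-Bllossom-8B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # FP8 here refers to the hosted quant; bf16 is a safe local default
    device_map="auto",
)

# A Korean prompt ("Please briefly explain the Fibonacci sequence."):
# the model reasons internally in English and is expected to answer
# in the language of the input.
messages = [{"role": "user", "content": "피보나치 수열을 간단히 설명해 주세요."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```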
Popular Sampler Settings
The top parameter combinations used by Featherless users for this model cover the following samplers: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
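As a usage sketch, the sampler parameters above can be passed through an OpenAI-compatible client. The base URL, the server accepting non-standard samplers via `extra_body`, and all parameter values below are assumptions for illustration; they are not the (unlisted) popular configurations from this page.

```python
# Hedged sketch: sampler settings via an OpenAI-compatible endpoint.
# The base URL and extra_body support are assumptions about the server;
# the values are illustrative placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="UNIVA-Bllossom/DeepSeek-llama3.1-Bllossom-8B",
    # Korean prompt ("What is the capital of Korea?")
    messages=[{"role": "user", "content": "한국의 수도는 어디인가요?"}],
    temperature=0.7,             # standard OpenAI-style sampler parameters
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    extra_body={                 # non-standard samplers, if the server accepts them
        "top_k": 40,
        "repetition_penalty": 1.1,
        "min_p": 0.05,
    },
)
print(response.choices[0].message.content)
```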