beomi/gemma-ko-7b
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 8.5B · Quant: FP8 · Ctx Length: 8K · Published: Mar 2, 2024 · License: gemma-terms-of-use · Architecture: Transformer · 0.1K Cold
Gemma-Ko-7b is an 8.5 billion parameter decoder-only language model developed by Junbum Lee (Beomi) and Taekyoon Choi, based on Google's Gemma architecture. This model is specifically designed for generating Korean and English text, making it suitable for various text generation tasks including question answering, summarization, and reasoning in both languages. Its relatively compact size and efficient architecture allow for deployment in resource-limited environments, democratizing access to advanced AI capabilities.
Popular Sampler Settings
The top three parameter combinations used by Featherless users for this model.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
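The sampler settings above map directly onto the fields of an OpenAI-style completions request. Below is a minimal sketch of assembling such a request body for this model; the parameter values are illustrative placeholders (the popular configs were not captured above), and the Featherless endpoint URL shown in the comment is an assumption, not confirmed by this page.

```python
def build_completion_payload(prompt,
                             temperature=0.7,
                             top_p=0.9,
                             top_k=40,
                             frequency_penalty=0.0,
                             presence_penalty=0.0,
                             repetition_penalty=1.1,
                             min_p=0.05):
    """Assemble the JSON body for a text-completion request.

    All default values here are illustrative, not the actual
    popular configs for this model.
    """
    return {
        "model": "beomi/gemma-ko-7b",
        "prompt": prompt,
        "temperature": temperature,
        "top_p": top_p,
        "top_k": top_k,
        "frequency_penalty": frequency_penalty,
        "presence_penalty": presence_penalty,
        "repetition_penalty": repetition_penalty,
        "min_p": min_p,
    }

payload = build_completion_payload("한국의 수도는 어디인가요?")
# Sending the request (requires an API key) might look like:
#   requests.post("https://api.featherless.ai/v1/completions",
#                 headers={"Authorization": f"Bearer {API_KEY}"},
#                 json=payload)
```

Only the fields you set need to appear in the body; parameters left at the provider's defaults can simply be omitted from the payload.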