AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct
Text Generation · Concurrency Cost: 1 · Model Size: 7.1B · Quant: FP8 · Ctx Length: 2k · Published: Apr 28, 2023 · License: other · Architecture: Transformer
AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct is a 7.1 billion parameter decoder-only transformer language model developed by AI Sweden in collaboration with RISE and WASP WARA for Media and Language. This instruction-tuned model is trained on a 320 billion token dataset comprising Swedish, Norwegian, Danish, Icelandic, English, and programming code. It excels at generating coherent text in five languages and four programming languages, and can perform various text tasks through instruction-based generation.
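As a sketch of instruction-based generation with this model, the helper below builds a chat-style prompt for a user message. The exact template (the `<|endoftext|><s>` / `User:` / `Bot:` markers) is an assumption based on AI Sweden's published chat format for the GPT-SW3 instruct models; verify it against the official model card before use.

```python
def build_prompt(user_message: str) -> str:
    """Wrap a user message in the assumed GPT-SW3 instruct chat template.

    Template markers are an assumption from AI Sweden's published
    format for the v2-instruct models, not guaranteed by this page.
    """
    return (
        "<|endoftext|><s>\n"
        f"User:\n{user_message}\n"
        "<s>\nBot:\n"
    )

# Example: a Swedish completion prompt ("Trees are nice because").
prompt = build_prompt("Träd är fina för att")
print(prompt)
```

The resulting string would then be passed to an inference endpoint or, for local use, to a Hugging Face `transformers` text-generation pipeline loaded with `AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct` (model loading is omitted here, since the 6.7B checkpoint is several gigabytes).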