AI-Sweden-Models/gpt-sw3-1.3b-instruct
Text Generation · Concurrency Cost: 1 · Model Size: 1.4B · Quant: BF16 · Context Length: 2k · Published: Apr 28, 2023 · License: other · Architecture: Transformer

AI-Sweden-Models/gpt-sw3-1.3b-instruct is a 1.4-billion-parameter, decoder-only transformer language model developed by AI Sweden in collaboration with RISE and the WASP WARA for Media and Language. This instruction-tuned model generates coherent text in Swedish, Norwegian, Danish, Icelandic, English, and programming languages, and can perform a range of text tasks through instruction following. It was trained on a 320-billion-token dataset that emphasizes Nordic languages and code, making it particularly suitable for multilingual applications in those regions.
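As a sketch of how instruction following works in practice, the snippet below wraps a user message in the turn-based prompt format documented on the model's Hugging Face card (assumption: the `<|endoftext|><s>` / `User:` / `Bot:` delimiters shown there) and notes, in comments, how the prompt would be fed to the model with the standard `transformers` API. The actual generation step is commented out because it downloads the full BF16 checkpoint.

```python
# Sketch: prompting gpt-sw3-1.3b-instruct. The turn format below is an
# assumption based on the prompt template shown on the model's Hugging
# Face card; verify against the card before relying on it.

def build_prompt(user_message: str) -> str:
    """Wrap a user message in the GPT-SW3 instruct turn format."""
    return f"<|endoftext|><s>\nUser:\n{user_message}\n<s>\nBot:\n"

prompt = build_prompt("Vad är huvudstaden i Sverige?")

# Running generation locally (downloads ~2.8 GB of weights):
#
# from transformers import AutoTokenizer, AutoModelForCausalLM
# name = "AI-Sweden-Models/gpt-sw3-1.3b-instruct"
# tok = AutoTokenizer.from_pretrained(name)
# model = AutoModelForCausalLM.from_pretrained(name)
# ids = tok(prompt, return_tensors="pt").input_ids
# out = model.generate(ids, max_new_tokens=100, do_sample=True, temperature=0.6)
# print(tok.decode(out[0]))
```

The same prompt string works whether the model is run locally or served behind an API, since the instruct tuning expects this turn structure in the raw text.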


Popular Sampler Settings

Featherless users most commonly tune the following sampler parameters for this model:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
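To make the parameter names concrete, here is an illustrative sampler configuration using those fields (the values are examples, not the actual Featherless user presets), with a commented sketch of how it could be passed to an OpenAI-compatible completions endpoint — the Featherless base URL and the `extra_body` pass-through are assumptions.

```python
# Illustrative sampler config; values are examples, not measured presets.
sampler_config = {
    "temperature": 0.7,        # softmax temperature; lower = more deterministic
    "top_p": 0.9,              # nucleus sampling: smallest token set with cum. prob >= 0.9
    "top_k": 40,               # restrict sampling to the 40 most likely tokens
    "frequency_penalty": 0.0,  # penalize tokens proportionally to prior occurrences
    "presence_penalty": 0.0,   # penalize any token that has already appeared
    "repetition_penalty": 1.1, # multiplicative penalty on repeats; >1 discourages them
    "min_p": 0.05,             # drop tokens below 5% of the top token's probability
}

# With an OpenAI-compatible client (assumption: Featherless exposes one),
# the config could be forwarded roughly like this:
#
# from openai import OpenAI
# client = OpenAI(base_url="https://api.featherless.ai/v1", api_key="...")
# resp = client.completions.create(
#     model="AI-Sweden-Models/gpt-sw3-1.3b-instruct",
#     prompt="Skriv en dikt om hösten.",
#     extra_body=sampler_config,
# )
```

Lower `temperature` and `top_p` trade diversity for determinism; `repetition_penalty` above 1.0 is a common guard against the looping that small instruct models can exhibit.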