AI-Sweden-Models/gpt-sw3-126m-instruct
Text generation · Concurrency cost: 1 · Model size: 0.2B · Quant: BF16 · Context length: 2k · Published: Apr 28, 2023 · License: other · Architecture: Transformer

AI-Sweden-Models/gpt-sw3-126m-instruct is a 126-million-parameter decoder-only transformer language model developed by AI Sweden in collaboration with RISE and WASP WARA for Media and Language. It is an instruction-tuned variant of the GPT-Sw3 series, fine-tuned on chat and raw-text instruction data. The model generates coherent text in Swedish, Norwegian, Danish, Icelandic, and English, as well as four programming languages, and is suited to performing text tasks through instruction-based generation.
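For instruction-based generation, the prompt is typically wrapped in the GPT-Sw3 chat format before being sent to the model. A minimal sketch of such a prompt builder, assuming the delimiter layout documented for the GPT-Sw3 instruct family (verify against the tokenizer's own chat template before relying on it):

```python
def build_prompt(user_message: str) -> str:
    """Wrap a user message in the chat format assumed for
    GPT-Sw3 instruct models: an end-of-text token, then
    alternating User/Bot turns separated by <s> markers."""
    return (
        "<|endoftext|><s>\n"
        "User:\n"
        f"{user_message}\n"
        "<s>\n"
        "Bot:\n"
    )

# Example: a Swedish instruction, ready to pass to the model.
prompt = build_prompt("Vad är huvudstaden i Sverige?")
print(prompt)
```

The resulting string can then be tokenized and passed to the model (e.g. via `transformers`' `AutoModelForCausalLM.generate`); the model continues the text after the `Bot:` turn.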


Popular Sampler Settings

Featherless tracks the top 3 parameter combinations its users apply to this model. The tunable sampler parameters are:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
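These parameters correspond to fields of an OpenAI-compatible completions request, which is how Featherless-style endpoints are commonly called. A minimal sketch of such a request payload; the numeric values below are illustrative placeholders, not the actual top-3 configs (those values are not reproduced here):

```python
# Illustrative sampler settings for a completions request.
# Values are placeholders, not measured user configurations.
payload = {
    "model": "AI-Sweden-Models/gpt-sw3-126m-instruct",
    "prompt": "Skriv en kort dikt om hösten.",
    "temperature": 0.7,          # randomness of sampling
    "top_p": 0.9,                # nucleus sampling cutoff
    "top_k": 40,                 # sample only from the 40 likeliest tokens
    "frequency_penalty": 0.0,    # penalize tokens by how often they appeared
    "presence_penalty": 0.0,     # penalize tokens that appeared at all
    "repetition_penalty": 1.1,   # multiplicative repetition discouragement
    "min_p": 0.05,               # drop tokens below 5% of the top probability
    "max_tokens": 128,
}
print(sorted(payload))
```

This dict would be POSTed as JSON to the endpoint's `/v1/completions` route; lower `temperature` and `top_p` make output more deterministic, while the penalty parameters reduce repetition.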