danish-foundation-models/gemma-3-1b-scratch-dynaword-full-v1 is a 1-billion-parameter language model in the Gemma family, developed by danish-foundation-models. It was trained from scratch as part of the Dynaword paper to demonstrate the improvements gained from training on the Danish Dynaword dataset. The model targets Danish-language understanding and generation tasks and supports a 32,768-token context length.
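As a causal language model hosted under the repo id above, it can likely be used through the Hugging Face `transformers` auto classes. The sketch below is a minimal, hedged example assuming the `transformers` library is installed and the checkpoint is downloadable from the Hub; the example prompt and generation settings are illustrative, not from the model card.

```python
MODEL_ID = "danish-foundation-models/gemma-3-1b-scratch-dynaword-full-v1"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a Danish continuation of `prompt` with the model above.

    The heavy imports are deferred so that merely defining this function
    does not require `transformers` or trigger a model download.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Download (or load from cache) the tokenizer and model weights.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

    # Tokenize the prompt and generate a continuation greedily.
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Hypothetical Danish prompt, chosen only for illustration.
    print(generate("Danmark er et land i"))
```

Note that the 32,768-token context length applies to the combined prompt and generated tokens, so long inputs leave correspondingly less room for `max_new_tokens`.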