mhenrichsen/danskgpt-tiny
Text generation · Model size: 1.1B · Quantization: BF16 · Context length: 2k · Published: Jan 3, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

DanskGPT-tiny is a 1.1 billion parameter LLaMA-based language model developed by mhenrichsen. It continues the pretraining of TinyLlama, specializing the model for Danish. Trained on 8 billion tokens of synthetic Danish text, it is a foundation (completion) model optimized for generating Danish text rather than conversational chat.
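Since this is a completion model rather than a chat model, you prompt it with the beginning of a Danish text and let it continue. A minimal sketch with the Hugging Face `transformers` library, assuming the weights are published on the Hub under the id `mhenrichsen/danskgpt-tiny` (verify the exact repository id before use):

```python
# Sketch only: assumes the model id "mhenrichsen/danskgpt-tiny" on the
# Hugging Face Hub; adjust if the repository is named differently.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "mhenrichsen/danskgpt-tiny"

def complete(prompt: str, max_new_tokens: int = 50) -> str:
    """Continue a Danish text prompt with the model's completion."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the published quantization listed above.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    # A completion-style prompt ("Denmark is a country in"), not a chat turn.
    print(complete("Danmark er et land i"))
```

Note that the 2k context length bounds the combined prompt and completion, so long inputs must be truncated before generation.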
