xiaojunyy/gpt2-sft-dutch
Text generation | Concurrency cost: 1 | Model size: 1B | Quantization: BF16 | Context length: 32k | Architecture: Transformer | Warm
The xiaojunyy/gpt2-sft-dutch model is a 1-billion-parameter GPT-2-based language model trained from scratch and then supervised fine-tuned (SFT) for Dutch text generation. It is suited to tasks that require understanding and generating Dutch, such as content creation, translation, or conversational AI in Dutch. Its training from scratch on a generator dataset suggests a foundational capability in Dutch language processing.
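A minimal usage sketch, assuming the model is published on the Hugging Face Hub under the id xiaojunyy/gpt2-sft-dutch and loads with the standard transformers causal-LM classes (the Dutch prompt below is purely illustrative):

```python
# Hypothetical usage sketch: model id and loadability via the generic
# AutoTokenizer/AutoModelForCausalLM classes are assumptions, not
# confirmed by the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "xiaojunyy/gpt2-sft-dutch"


def generate_dutch(prompt: str, max_new_tokens: int = 50) -> str:
    """Generate a sampled Dutch continuation of `prompt`."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        top_p=0.9,
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # "Amsterdam is the capital of" -- an example Dutch prompt.
    print(generate_dutch("Amsterdam is de hoofdstad van"))
```

With a 32k context window, longer prompts can be passed directly, though generation quality will depend on the fine-tuning data.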