BramVanroy/llama2-13b-ft-mc4_nl_cleaned_tiny
Text generation · Concurrency cost: 1 · Model size: 13B · Quant: FP8 · Context length: 4k · License: apache-2.0 · Architecture: Transformer · Open weights

BramVanroy/llama2-13b-ft-mc4_nl_cleaned_tiny is a 13-billion-parameter Llama 2 model fine-tuned by Bram Vanroy on the 'tiny' partition of the yhavinga/mc4_nl_cleaned dataset, with a 4096-token context length. The fine-tuning targets fluency in Dutch, making this a generative model primarily for Dutch-language tasks. It is also a suitable base for further fine-tuning on tasks such as summarization or instruction following.
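A minimal usage sketch with the Hugging Face `transformers` library, assuming it is installed and enough memory is available for a 13B checkpoint. The helper name `generate_dutch` and its defaults are illustrative, not part of this card; only the model id comes from the card above.

```python
# Illustrative helper (not part of the model card): generate a Dutch
# continuation with the checkpoint named above via transformers.

MODEL_ID = "BramVanroy/llama2-13b-ft-mc4_nl_cleaned_tiny"  # from the card


def generate_dutch(prompt: str, max_new_tokens: int = 64) -> str:
    """Return `prompt` plus a generated Dutch continuation."""
    # Imported lazily so merely loading this module does not require
    # the transformers dependency or downloading the 13B weights.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model=MODEL_ID,
        device_map="auto",  # spread the model across available devices
    )
    return generator(prompt, max_new_tokens=max_new_tokens)[0]["generated_text"]


if __name__ == "__main__":
    print(generate_dutch("Gisteren ging ik naar de markt en"))
```

Because the model was trained as a plain causal language model (not instruction-tuned), prompting it with the start of a Dutch sentence and letting it continue, as above, is the expected usage pattern.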
