emillykkejensen/Llama-3-8B-instruct-dansk
Text generation · Model size: 8B · Quantization: FP8 · Context length: 8k · License: other · Architecture: Transformer · Concurrency cost: 1

emillykkejensen/Llama-3-8B-instruct-dansk is an 8-billion-parameter instruction-tuned causal language model, fine-tuned from Meta's Llama-3-8B on the kobprof/skolegpt-instruct dataset. It is optimized for Danish instruction following and aims to improve performance on Danish-specific natural language processing tasks, reaching a loss of 0.9477 on its evaluation set.
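As a Llama-3 instruct fine-tune, the model expects prompts in the Llama-3 chat template. A minimal sketch of that layout is below; the `format_llama3_prompt` helper and the Danish messages are illustrative, not from this card — in practice you would let `tokenizer.apply_chat_template` from the transformers library render this for you.

```python
def format_llama3_prompt(messages):
    """Render chat messages in the Llama-3 instruct template.

    Hypothetical helper for illustration; with transformers, call
    tokenizer.apply_chat_template(messages, add_generation_prompt=True)
    instead of formatting by hand.
    """
    parts = ["<|begin_of_text|>"]
    for msg in messages:
        parts.append(
            f"<|start_header_id|>{msg['role']}<|end_header_id|>\n\n"
            f"{msg['content']}<|eot_id|>"
        )
    # Leave the assistant header open so the model generates the reply.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)

# Example Danish conversation (contents are placeholders).
messages = [
    {"role": "system", "content": "Du er en hjælpsom dansk assistent."},
    {"role": "user", "content": "Hvad er Danmarks hovedstad?"},
]
prompt = format_llama3_prompt(messages)
print(prompt)
```

The rendered string is what the tokenizer ultimately sees; ending on an open assistant header is what prompts the model to continue with its answer.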
