rsinema/Qwen2.5-0.5B-Instruct-dm
Text Generation · Open Weights
Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Oct 14, 2024 · License: apache-2.0 · Architecture: Transformer
rsinema/Qwen2.5-0.5B-Instruct-dm is a 0.5 billion parameter instruction-tuned causal language model, fine-tuned from Qwen/Qwen2.5-0.5B-Instruct. The fine-tuning was performed on a dataset of fantasy book text, specializing the model in generating and understanding content related to fantasy literature. It supports a context length of 32,768 tokens, suiting it to tasks that require contextual understanding over long passages within its specialized domain.
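As a sketch of how such an instruction-tuned model is typically used, the example below loads it with the Hugging Face `transformers` library and generates a completion via the Qwen2.5 chat template. The prompt text and the `build_messages` helper are illustrative assumptions, not part of this model card; the model id is the one listed above.

```python
MODEL_ID = "rsinema/Qwen2.5-0.5B-Instruct-dm"  # model id from this card

def build_messages(user_prompt: str) -> list:
    """Build a chat message list in the role/content format used by
    Qwen2.5 chat templates (helper name is illustrative)."""
    return [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": user_prompt},
    ]

if __name__ == "__main__":
    # Heavy imports and the model download happen only when run as a script.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    messages = build_messages("Describe a dragon's hoard in one sentence.")
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Since the model is BF16 at 0.5B parameters, it fits comfortably on CPU or a small GPU; `torch_dtype="auto"` lets `transformers` pick the checkpoint's native precision.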