Ramikan-BR/Qwen2-0.5B-v17
Text Generation · Concurrency Cost: 1 · Model Size: 0.5B · Quant: BF16 · Ctx Length: 32k · Published: Jul 28, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Ramikan-BR/Qwen2-0.5B-v17 is a 0.5 billion parameter Qwen2-based causal language model developed by Ramikan-BR. It was fine-tuned from unsloth/qwen2-0.5b-bnb-4bit using Unsloth together with Hugging Face's TRL library, which the author reports trained roughly 2x faster than a standard setup. The model supports a context length of 32,768 tokens and is suited to applications that need efficient, smaller-scale language processing.
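
The model can be loaded like any other causal language model from the Hugging Face Hub. Below is a minimal sketch using the standard Transformers API, assuming the checkpoint is published under the repo id Ramikan-BR/Qwen2-0.5B-v17 and that `transformers` and `torch` are installed; the prompt is only an illustration.

```python
# Minimal sketch: loading Ramikan-BR/Qwen2-0.5B-v17 with Hugging Face Transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Ramikan-BR/Qwen2-0.5B-v17"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # listing reports BF16 weights
    device_map="auto",
)

# Generate a short completion from an example prompt.
prompt = "Write a Python function that reverses a string:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```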
