Ramikan-BR/Qwen2-0.5B-v11
Task: Text Generation
Model Size: 0.5B
Quantization: BF16
Context Length: 32k
Published: Jul 22, 2024
License: apache-2.0
Architecture: Transformer (open weights)

Ramikan-BR/Qwen2-0.5B-v11 is a 0.5 billion parameter Qwen2-based causal language model developed by Ramikan-BR. It was fine-tuned from unsloth/qwen2-0.5b-bnb-4bit using Unsloth and Hugging Face's TRL library for faster training. It supports a context length of 32,768 tokens, making it suitable for applications that need to process longer sequences efficiently.
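The model can be loaded like any other Hugging Face causal language model. The sketch below assumes the `transformers` library is installed and that the checkpoint follows the standard Qwen2 layout; the prompt shown is an illustrative placeholder, not a documented template for this fine-tune.

```python
# Minimal sketch: generate text with Ramikan-BR/Qwen2-0.5B-v11.
# Assumes `transformers` and `torch` are installed; downloads weights on first use.

MODEL_ID = "Ramikan-BR/Qwen2-0.5B-v11"
MAX_CONTEXT = 32768  # context length stated on the model card


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load the model lazily and generate a completion for `prompt`."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # BF16, matching the card's quantization
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Write a haiku about autumn."))
```

Because the base checkpoint was prepared with Unsloth's 4-bit quantized weights, loading in BF16 as shown is one reasonable choice; 4-bit loading via `bitsandbytes` would reduce memory further.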
