Ramikan-BR/tinyllama-coder-py-v21
Text generation · Model size: 1.1B · Quant: BF16 · Context length: 2k · License: apache-2.0 · Architecture: Transformer · Open weights

Ramikan-BR/tinyllama-coder-py-v21 is a 1.1-billion-parameter Llama-based model developed by Ramikan-BR, fine-tuned from unsloth/tinyllama-chat-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, which speeds up fine-tuning, and is optimized for code generation, particularly in Python.
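Since the model is derived from a TinyLlama chat checkpoint, prompts are typically wrapped in a chat template before generation. The Zephyr-style template below is an assumption based on the upstream TinyLlama chat models, not something stated on this page; in practice you would confirm it via the model tokenizer's `apply_chat_template` method.

```python
# Minimal sketch of prompt formatting for a TinyLlama-chat-derived model.
# The <|system|>/<|user|>/<|assistant|> markers are assumed from the
# upstream TinyLlama chat format; verify against the model's tokenizer.

def build_prompt(instruction: str) -> str:
    """Wrap a coding instruction in a TinyLlama-style chat prompt."""
    return (
        "<|system|>\nYou are a helpful Python coding assistant.</s>\n"
        f"<|user|>\n{instruction}</s>\n"
        "<|assistant|>\n"
    )

prompt = build_prompt("Write a function that reverses a string.")
print(prompt)
```

The resulting string would then be passed to the model (for example through a `transformers` text-generation pipeline loaded with the `Ramikan-BR/tinyllama-coder-py-v21` identifier), with generation stopped at the `</s>` token.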
