Ramikan-BR/tinyllama-coder-py-v11
Text Generation · Model Size: 1.1B · Quant: BF16 · Context Length: 2k · Published: May 26, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
Ramikan-BR/tinyllama-coder-py-v11 is a 1.1 billion parameter TinyLlama model developed by Ramikan-BR and fine-tuned for code generation, with a focus on Python. It was fine-tuned with Unsloth for accelerated training, in conjunction with Hugging Face's TRL library, and is optimized for efficient code-related tasks within a 2048-token context window.
Model Overview
Ramikan-BR/tinyllama-coder-py-v11 is a 1.1 billion parameter language model based on the TinyLlama architecture, developed by Ramikan-BR. It is fine-tuned specifically for code generation tasks, with a focus on Python, using the code.evol.instruct.wiz.oss_python.json dataset.
Key Characteristics
- Architecture: TinyLlama, a compact and efficient causal language model.
- Parameter Count: 1.1 billion parameters, making it suitable for resource-constrained environments.
- Context Length: Supports a context window of 2048 tokens.
- Training Efficiency: Fine-tuned with Unsloth, which enabled 2x faster training, in conjunction with Hugging Face's TRL library (a minimal loading sketch follows this list).
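As a standard causal language model with open weights, the model can be loaded through the Hugging Face transformers API. The sketch below is illustrative only: the model ID comes from this card, while the dtype choice and generation settings are assumptions, not values recommended by the author.

```python
# Minimal loading sketch for tinyllama-coder-py-v11 via transformers.
# The model ID comes from this card; the generation settings below are
# illustrative assumptions, not values documented by the author.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Ramikan-BR/tinyllama-coder-py-v11"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # card lists BF16 weights
    device_map="auto",
)

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=256,  # stays well inside the 2048-token context window
    do_sample=False,     # greedy decoding for deterministic code output
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Greedy decoding is used here because deterministic output is usually preferable for code; sampling parameters can be enabled for more varied completions.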
Use Cases
This model is particularly well-suited for:
- Python Code Generation: Generating Python code snippets or functions from natural-language instructions (see the prompt sketch after this list).
- Code Completion: Assisting developers with code completion in Python.
- Educational Tools: Integration into tools for learning or practicing Python programming, thanks to its small size and specialized focus.
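The card does not document the prompt template used during fine-tuning. Since the training data is an Evol-Instruct-style dataset and Unsloth fine-tunes commonly use Alpaca-style templates, the wrapper below is a hypothetical starting point; adjust it to match the format actually used during training.

```python
# Hypothetical Alpaca-style prompt wrapper. The card does not document the
# training template, so this format is an assumption, not the author's
# confirmed prompt layout.
def build_prompt(instruction: str, context: str = "") -> str:
    prompt = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
    )
    if context:
        # Optional input section, e.g. partial code to complete.
        prompt += f"### Input:\n{context}\n\n"
    return prompt + "### Response:\n"

print(build_prompt("Write a Python function that computes the factorial of n."))
```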