arif-butt/tinyllama-trl-merged
Text generation · Model size: 1.1B · Quant: BF16 · Context length: 2k · Published: Mar 25, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

arif-butt/tinyllama-trl-merged is a 1.1-billion-parameter Llama-based transformer decoder, fine-tuned by arif-butt with the TRL framework. It is a standalone model with a 2048-token context length: the LoRA weights have been permanently merged into the base model, so no adapter libraries are required at inference time. The model is optimized for conversational responses to educational Q&A and is suited to production deployment in BF16 precision.
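Because the LoRA weights are merged, the checkpoint can be loaded with plain Hugging Face transformers, without peft or other adapter libraries. The sketch below is a minimal, hedged example; it assumes the model is published on the Hugging Face Hub under this id with a chat template, and the helper name `generate_reply` is illustrative, not part of the model card.

```python
# Minimal sketch: chat inference with plain transformers (no peft needed,
# since the LoRA weights are already merged into the base model).
# Assumptions: the Hub id below resolves and the tokenizer ships a chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "arif-butt/tinyllama-trl-merged"

def generate_reply(prompt: str, max_new_tokens: int = 128) -> str:
    """Answer a single chat turn with the merged checkpoint in BF16."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16
    )
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated continuation, not the prompt tokens.
    return tokenizer.decode(
        output_ids[0, input_ids.shape[-1]:], skip_special_tokens=True
    )

if __name__ == "__main__":
    print(generate_reply("Explain gradient descent in one sentence."))
```

Keeping the 2048-token context limit in mind, long conversations should be truncated before calling `apply_chat_template`.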
