RJTPP/scot0402s-deepseek-llama-8b-REF-full
Text Generation | Concurrency Cost: 1 | Model Size: 8B | Quant: FP8 | Ctx Length: 32k | Published: Apr 10, 2026 | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold
RJTPP/scot0402s-deepseek-llama-8b-REF-full is an 8-billion-parameter Llama-based language model developed by RJTPP, fine-tuned from unsloth/DeepSeek-R1-Distill-Llama-8B-unsloth-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, which enables up to 2x faster fine-tuning. The model targets general language tasks, building on the Llama architecture and this efficient training setup.
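A minimal usage sketch with the transformers library. It assumes the model is available on the Hugging Face Hub under the repo ID above; the prompt and generation parameters are illustrative, not recommendations from the model card.

```python
# Sketch: load and run the model with transformers (assumes torch and
# accelerate are installed; repo ID taken from the model card above).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RJTPP/scot0402s-deepseek-llama-8b-REF-full"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the checkpoint's native dtype
    device_map="auto",    # place layers on available GPU(s)/CPU
)

# Example prompt; generation settings are assumptions for illustration.
prompt = "Explain the difference between a list and a tuple in Python."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```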