ozertuu/Lama3.1-8B-EksiSozlukAI
Text Generation | Model Size: 8B | Quant: FP8 | Context Length: 8k | Published: Nov 25, 2024 | License: llama3 | Architecture: Transformer | Concurrency Cost: 1

ozertuu/Lama3.1-8B-EksiSozlukAI is an 8-billion-parameter Llama 3.1 model published by ozertuu, fine-tuned from ytu-ce-cosmos/Turkish-Llama-8b-DPO-v0.1. It was trained with Unsloth and Hugging Face's TRL library, which the author reports gave roughly 2x faster training. The model targets applications that need a capable Turkish Llama 3.1 base trained with this optimized pipeline.
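As a Llama 3.1 derivative, the model expects prompts in the standard Llama 3.1 chat layout. The sketch below builds such a prompt by hand for illustration; the Turkish system prompt and the helper name are assumptions, not part of the model card, and in practice you would load the tokenizer from the Hub and call `tokenizer.apply_chat_template` instead:

```python
# Minimal sketch of the standard Llama 3.1 single-turn chat prompt layout.
# In real use, prefer:
#   from transformers import AutoTokenizer
#   tok = AutoTokenizer.from_pretrained("ozertuu/Lama3.1-8B-EksiSozlukAI")
#   prompt = tok.apply_chat_template(messages, tokenize=False,
#                                    add_generation_prompt=True)

def build_llama31_prompt(system: str, user: str) -> str:
    """Format one system + user turn in the Llama 3.1 chat template."""
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        # The trailing assistant header cues the model to generate a reply.
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

if __name__ == "__main__":
    # Hypothetical Turkish system prompt and topic, for illustration only.
    prompt = build_llama31_prompt(
        "Sen Ekşi Sözlük tarzında entry yazan bir asistansın.",
        "yapay zeka hakkında bir entry yaz",
    )
    print(prompt)
```

The resulting string is what the tokenizer would produce for a single-turn conversation and can be passed to any text-generation endpoint serving this model.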
