hdeldar/llama-2-7b-persian-text-1k-1
Task: Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · License: apache-2.0 · Architecture: Transformer · Open Weights
hdeldar/llama-2-7b-persian-text-1k-1 is a 7-billion-parameter Llama 2 model, fine-tuned by hdeldar using QLoRA (4-bit precision) on a subset of the Persian-Text-QA dataset. It is intended for educational use, focusing on Persian text generation and understanding. The model supports a 4096-token context length and is sized for learning and experimentation in a Google Colab environment.
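A minimal usage sketch, assuming the standard Hugging Face `transformers` loading path with 4-bit quantization via `bitsandbytes` (matching the QLoRA training setup). The model ID comes from this card; the prompt template and helper names are illustrative assumptions, not part of the model's documented interface.

```python
MODEL_ID = "hdeldar/llama-2-7b-persian-text-1k-1"


def build_prompt(question: str) -> str:
    # Simple instruction-style prompt; the exact template this
    # fine-tune expects is an assumption.
    return f"### Question:\n{question}\n\n### Answer:\n"


def load_quantized(model_id: str = MODEL_ID):
    # Imports are kept local so the prompt helper above works even
    # without the heavy dependencies installed (e.g. outside Colab).
    import torch
    from transformers import (
        AutoModelForCausalLM,
        AutoTokenizer,
        BitsAndBytesConfig,
    )

    quant_config = BitsAndBytesConfig(
        load_in_4bit=True,  # 4-bit weights, in line with the QLoRA fine-tune
        bnb_4bit_compute_dtype=torch.float16,
    )
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        quantization_config=quant_config,
        device_map="auto",  # lets accelerate place layers on the free Colab GPU
    )
    return tokenizer, model
```

In a Colab session one would call `load_quantized()` once, then tokenize `build_prompt(...)` and pass it to `model.generate`, keeping the total prompt plus output within the 4096-token context window.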