The alexzaza/Fine-Tuned-TinyLlama-Crane-Model is a 1.1-billion-parameter language model developed by Alex Junior Fankem. It is based on the TinyLlama architecture and fine-tuned with LoRA (Low-Rank Adaptation), a parameter-efficient fine-tuning method that adapts the base TinyLlama model without retraining all of its weights. The model has a context length of 2,048 tokens and is intended for general language tasks.
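A minimal usage sketch, assuming the checkpoint is published on the Hugging Face Hub under the repo ID above and the LoRA weights are merged into the base model (adapter-only repos would instead need to be loaded with `peft` on top of the base TinyLlama checkpoint):

```python
# Hypothetical loading example; assumes merged LoRA weights and a
# standard transformers-compatible checkpoint on the Hugging Face Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "alexzaza/Fine-Tuned-TinyLlama-Crane-Model"  # repo ID from this card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Prompt and generate within the model's 2,048-token context window.
inputs = tokenizer("Describe the main safety checks for a tower crane.",
                   return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```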