alexzaza/Fine-Tuned-TinyLlama-Crane-Model
Text generation · Model size: 1.1B · Quantization: BF16 · Context length: 2k · Concurrency cost: 1 · Published: Apr 4, 2025 · Architecture: Transformer · Availability: Warm

The alexzaza/Fine-Tuned-TinyLlama-Crane-Model is a 1.1-billion-parameter language model developed by Alex Junior Fankem. It is based on the TinyLlama architecture and fine-tuned with LoRA (Low-Rank Adaptation), a parameter-efficient fine-tuning method. With a context length of 2,048 tokens, the model is designed for general language tasks, adapting the base TinyLlama model without retraining its full weight matrices.
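To illustrate why LoRA makes fine-tuning efficient, here is a minimal NumPy sketch of the core idea: the frozen base weight matrix is left untouched, and only two small low-rank factors are trained, whose product forms the weight update. All dimensions below are illustrative and not taken from this model.

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r = 64, 64, 8   # r << d_in is the low-rank bottleneck (illustrative sizes)
alpha = 16                   # LoRA scaling hyperparameter

W = rng.standard_normal((d_out, d_in))  # frozen base weight (not updated)
A = rng.standard_normal((r, d_in))      # trainable down-projection
B = np.zeros((d_out, r))                # trainable up-projection (initialized to zero)

# Effective weight after fine-tuning: the base weight plus a scaled
# low-rank update. Because B starts at zero, training begins exactly
# at the base model's behavior.
W_eff = W + (alpha / r) * (B @ A)

# Trainable-parameter count drops from d_out*d_in to r*(d_in + d_out).
full_params = d_out * d_in          # 4096
lora_params = r * (d_in + d_out)    # 1024
print(full_params, lora_params)
```

With these toy dimensions the trainable parameters shrink by 4x; at the scale of a 1.1B-parameter transformer, the same construction applied to the attention projections is what keeps LoRA fine-tuning cheap relative to full fine-tuning.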
