gutentag/alpaca-lora
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | License: MIT | Architecture: Transformer | Open Weights | Cold

gutentag/alpaca-lora is a 7-billion-parameter language model developed by gutentag. It is a fine-tuned version of the LLaMA base model, adapted with the LoRA (Low-Rank Adaptation) method, and is intended for general-purpose conversational AI, offering efficient performance across a range of natural language understanding and generation tasks.
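
The listing does not specify how to call the model, so the following is a minimal usage sketch based on the common pattern for LoRA-adapted LLaMA models: load a 7B LLaMA base checkpoint with Hugging Face transformers, attach the adapter with peft, and generate from an Alpaca-style instruction prompt. The base model ID, the adapter path, the prompt template, and the generation settings are assumptions, not details confirmed by this page.

```python
# Sketch: loading a LLaMA 7B base model and applying an alpaca-lora adapter.
# Base model ID, adapter ID, and prompt template are assumed, not taken from this listing.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_id = "huggyllama/llama-7b"   # assumed 7B LLaMA base weights
adapter_id = "gutentag/alpaca-lora"     # adapter name as shown on this page

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
model = AutoModelForCausalLM.from_pretrained(
    base_model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)
# Attach the LoRA adapter weights on top of the frozen base model.
model = PeftModel.from_pretrained(model, adapter_id)
model.eval()

# Alpaca-style instruction prompt (assumed template).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nExplain what a LoRA adapter is in one sentence.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

If the hosted endpoint serves the model directly, the adapter-loading step would be handled server-side and only the prompt format above would matter; the template shown is the one commonly used for Alpaca-style instruction tuning and may differ from what this deployment expects.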
