CharlesLi/llama_2_llama_2_alpaca_1_full
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Jan 20, 2025 · License: llama2 · Architecture: Transformer · Open weights

CharlesLi/llama_2_llama_2_alpaca_1_full is a 7-billion-parameter language model fine-tuned from Meta's Llama-2-7b-chat-hf on a generator dataset, reaching a reported loss of 1.3132 on its evaluation set. The model is intended for tasks that benefit from this specialized fine-tuning, though its concrete use cases and limitations have not yet been documented.
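As a fine-tune of Llama-2-7b-chat-hf, the model can presumably be loaded with the standard Hugging Face `transformers` API. The sketch below is a minimal, hedged example: the model id is taken from this card, but the chat prompt template is an assumption inherited from the Llama-2 chat base model, and the generation settings are illustrative only.

```python
# Hypothetical usage sketch, assuming the model follows the Llama-2
# chat conventions of its base model (Llama-2-7b-chat-hf).
MODEL_ID = "CharlesLi/llama_2_llama_2_alpaca_1_full"


def build_prompt(user_message: str,
                 system: str = "You are a helpful assistant.") -> str:
    """Wrap a user message in the Llama-2 chat template (assumed format)."""
    return f"[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user_message} [/INST]"


if __name__ == "__main__":
    # Heavy step: downloads ~7B weights. Requires `pip install transformers torch`.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    prompt = build_prompt("Summarize what fine-tuning a language model means.")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The prompt helper is separated from the heavy model load so the template can be inspected or tested without downloading the weights.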
