CharlesLi/llama_2_llama_2_alpaca_2_full
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 20, 2025 · License: llama2 · Architecture: Transformer · Open weights

CharlesLi/llama_2_llama_2_alpaca_2_full is a 7 billion parameter language model fine-tuned from Meta's Llama-2-7b-chat-hf. It was fine-tuned on a dataset identified only as "generator" in its model card and reached a loss of 0.9404 on its evaluation set. The model is intended for general language generation tasks, building on the conversational capabilities of its Llama-2 chat base.
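Because this fine-tune derives from Llama-2-7b-chat-hf, prompts generally follow the Llama-2 chat template. The sketch below builds that template with plain string formatting; the helper name and the example system/user messages are illustrative, not taken from the model card.

```python
# Sketch of the Llama-2 chat prompt format that this fine-tune inherits
# from its Llama-2-7b-chat-hf base. The function name and messages are
# illustrative placeholders.

def build_llama2_chat_prompt(system: str, user: str) -> str:
    """Wrap a single-turn exchange in Llama-2's [INST] chat format."""
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

prompt = build_llama2_chat_prompt(
    "You are a helpful assistant.",
    "Summarize what fine-tuning does in one sentence.",
)
print(prompt)
```

In practice this string would be passed to the model's tokenizer (for example via Hugging Face `transformers`) before generation; using the chat format the base model was trained on usually yields better responses than raw text prompts.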
