CharlesLi/llama_2_alpaca_cot_simplest
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Dec 31, 2024 · License: llama2 · Architecture: Transformer · Open weights

CharlesLi/llama_2_alpaca_cot_simplest is a 7-billion-parameter language model fine-tuned from Meta's Llama-2-7b-chat-hf on an unspecified dataset, reaching a final validation loss of 0.7382. It is intended for general conversational AI tasks and retains the Llama 2 architecture with a 4096-token context length.
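Since the model is fine-tuned from Llama-2-7b-chat-hf, prompts presumably follow the standard Llama 2 chat template of the base model. This is an assumption, not something the model card states; the sketch below builds such a prompt, with the model ID taken from this page.

```python
def build_llama2_prompt(system_message: str, user_message: str) -> str:
    """Format a single-turn prompt in the standard Llama 2 chat template.

    Assumes the fine-tune inherits the base model's [INST] / <<SYS>> format;
    check the model's tokenizer config if outputs look malformed.
    """
    return (
        f"<s>[INST] <<SYS>>\n{system_message}\n<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )


if __name__ == "__main__":
    # Hypothetical usage with Hugging Face transformers (not run here,
    # since it downloads the full 7B checkpoint):
    #
    #   from transformers import AutoModelForCausalLM, AutoTokenizer
    #   model_id = "CharlesLi/llama_2_alpaca_cot_simplest"
    #   tokenizer = AutoTokenizer.from_pretrained(model_id)
    #   model = AutoModelForCausalLM.from_pretrained(model_id)
    #
    prompt = build_llama2_prompt(
        "You are a helpful assistant.",
        "Explain chain-of-thought prompting in one sentence.",
    )
    print(prompt)
```

The formatted string would then be tokenized and passed to the model's `generate` method in the usual transformers workflow.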
