CharlesLi/llama_2_cot_simplest_alpaca_3_3_epoch_full
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Jan 21, 2025 · License: llama2 · Architecture: Transformer (open weights)

CharlesLi/llama_2_cot_simplest_alpaca_3_3_epoch_full is a 7-billion-parameter language model fine-tuned from Meta's Llama-2-7b-chat-hf. It was trained for 3 epochs on a generator dataset and achieved a loss of 0.9498 on the evaluation set. The model is optimized for tasks matching its fine-tuning data, making it suitable for applications that require responses aligned with the generator dataset's characteristics.
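Since the model's lineage suggests Alpaca-style instruction tuning, prompts in that format are a reasonable starting point. The sketch below is a minimal prompt builder under that assumption; the exact template the fine-tuning data used is not confirmed by this card, and the helper name is hypothetical.

```python
# Hypothetical usage sketch. The Alpaca-style template below is an
# assumption inferred from the model name, not confirmed by the card.

def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Format a request in the standard Alpaca instruction style."""
    if input_text:
        # Variant with additional context in an "### Input:" section.
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    # Instruction-only variant.
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_alpaca_prompt("Explain chain-of-thought prompting in one sentence.")
print(prompt)
```

The resulting string would then be tokenized and passed to the model, e.g. via `transformers.AutoTokenizer.from_pretrained` and `AutoModelForCausalLM.from_pretrained` with the repository id above.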
