CharlesLi/llama_2_sky_safe_o1_llama_3_70B_reflect_1000_100_full
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Jan 13, 2025 · License: llama2 · Architecture: Transformer (open weights)

CharlesLi/llama_2_sky_safe_o1_llama_3_70B_reflect_1000_100_full is a 7-billion-parameter language model fine-tuned from Meta's Llama-2-7b-chat-hf. It was trained on the generator dataset and reached an evaluation loss of 0.7303. Building on the Llama 2 architecture, it is intended for general language generation tasks. No further details on intended uses or limitations are provided.
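Since the card gives no usage instructions, the sketch below shows one plausible way to query the model with the Hugging Face `transformers` library. The repo id comes from this card; the `[INST]`/`<<SYS>>` prompt template is Llama-2-chat's standard format, assumed to carry over to this fine-tune; the `generate` helper and parameter names are illustrative, not documented by the author.

```python
# Sketch of using this fine-tune via Hugging Face transformers.
# Assumptions: the model follows Llama-2-chat's prompt template; the
# `generate` helper below is hypothetical, not part of the model card.

def format_llama2_prompt(system: str, user: str) -> str:
    """Build a single-turn Llama-2 chat prompt using the [INST]/<<SYS>> template."""
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the fine-tune and generate text (needs `transformers` and the weights)."""
    from transformers import pipeline  # heavy dependency, kept local to this sketch

    pipe = pipeline(
        "text-generation",
        model="CharlesLi/llama_2_sky_safe_o1_llama_3_70B_reflect_1000_100_full",
    )
    return pipe(prompt, max_new_tokens=max_new_tokens)[0]["generated_text"]


if __name__ == "__main__":
    # Only the prompt formatting runs here; generation requires downloading weights.
    print(format_llama2_prompt("You are a helpful assistant.", "Hello!"))
```

The FP8 quantization and 4k context length listed above suggest the model fits on a single consumer GPU, but the card does not confirm hardware requirements.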
