CharlesLi/llama_2_sky_safe_o1_llama_3_70B_default_4000_1000_full
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 13, 2025 · License: llama2 · Architecture: Transformer · Open weights
CharlesLi/llama_2_sky_safe_o1_llama_3_70B_default_4000_1000_full is a 7 billion parameter language model fine-tuned from Meta's Llama-2-7b-chat-hf. It was trained on a generator dataset with a 4096-token context length, reaching a validation loss of 0.6372. The model is intended for general language generation tasks based on its Llama-2 foundation, with possible specialization inherited from its fine-tuning data.
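Since the checkpoint derives from Llama-2-7b-chat-hf, it can presumably be loaded with the Hugging Face `transformers` library. The sketch below is an assumption, not part of the card: it assumes the checkpoint is in the standard Llama-2 format and uses the generic `AutoModelForCausalLM`/`AutoTokenizer` API.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model ID from the card; context length as stated (4096 tokens).
MODEL_ID = "CharlesLi/llama_2_sky_safe_o1_llama_3_70B_default_4000_1000_full"
MAX_CONTEXT = 4096


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Download the checkpoint from the Hub and generate a completion.

    Note: a 7B model is large; a GPU is needed for practical use.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(
        prompt, return_tensors="pt", truncation=True, max_length=MAX_CONTEXT
    ).to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    print(generate("Explain what a context window is."))
```

Because the base model is the chat-tuned Llama-2 variant, wrapping prompts in the Llama-2 chat template (e.g. via `tokenizer.apply_chat_template`) may give better results than raw completion prompts.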