CharlesLi/llama_2_sky_safe_o1_llama_3_8B_reflect_4000_500_full
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 13, 2025 · License: llama2 · Architecture: Transformer · Open Weights · Cold

CharlesLi/llama_2_sky_safe_o1_llama_3_8B_reflect_4000_500_full is a 7-billion-parameter language model fine-tuned from Meta's Llama-2-7b-chat-hf. Training used a 4096-token context length and focused on a generator dataset, reaching a validation loss of 0.5898. The model is intended for tasks that call for a Llama-2-based model with this specific fine-tuning.
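Since the card lists no usage snippet, a minimal sketch of loading the model with the standard Hugging Face `transformers` API is shown below. The repo ID is taken from the title above; the dtype, device placement, and generation settings are assumptions, not values documented by the author.

```python
def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion from the model (sketch; assumes transformers + torch installed)."""
    # Lazy imports so the function can be defined without the heavy dependencies present.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "CharlesLi/llama_2_sky_safe_o1_llama_3_8B_reflect_4000_500_full"
    tokenizer = AutoTokenizer.from_pretrained(repo)
    # float16 and device_map="auto" are illustrative choices, not card-specified.
    model = AutoModelForCausalLM.from_pretrained(
        repo, torch_dtype=torch.float16, device_map="auto"
    )

    # The card states a 4096-token context length, so prompt + output must fit in 4k tokens.
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    new_tokens = output_ids[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Note that downloading the weights requires several gigabytes of disk and a GPU is strongly recommended for FP16 inference.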
