CharlesLi/llama_2_sky_safe_o1_4o_reflect_4000_1000_full
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 13, 2025 · License: llama2 · Architecture: Transformer · Open Weights
CharlesLi/llama_2_sky_safe_o1_4o_reflect_4000_1000_full is a 7-billion-parameter fine-tune of Llama-2-7b-chat-hf by CharlesLi. It was trained with a learning rate of 2e-05 and a cosine learning-rate scheduler over one epoch. The card lists no specific differentiators, but as a Llama 2-based chat model it is suited to general conversational AI tasks.
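As a rough illustration of the cosine learning-rate schedule mentioned above, the sketch below computes the per-step learning rate from the peak value of 2e-05. It assumes decay to zero and no warmup phase, since the card does not specify those details:

```python
import math

def cosine_lr(step, total_steps, peak_lr=2e-05):
    """Cosine-annealed learning rate: starts at peak_lr, decays to 0.

    This is a generic sketch of the schedule shape, not the exact
    trainer configuration used for this model (warmup, minimum LR,
    and step counts are unspecified on the card).
    """
    progress = step / total_steps
    return peak_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

# The rate falls from the peak at step 0 to roughly half at the
# midpoint and approaches zero at the final step.
print(cosine_lr(0, 1000))    # peak: 2e-05
print(cosine_lr(500, 1000))  # midpoint: ~1e-05
print(cosine_lr(1000, 1000)) # end: ~0
```

Over one epoch, `total_steps` would be the number of optimizer steps in that epoch; frameworks such as Hugging Face `transformers` expose an equivalent schedule via `lr_scheduler_type="cosine"`.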