CharlesLi/llama_2_sky_safe_o1_4o_reflect_1000_100_full
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 13, 2025 · License: llama2 · Architecture: Transformer · Open Weights
CharlesLi/llama_2_sky_safe_o1_4o_reflect_1000_100_full is a 7-billion-parameter fine-tune of Llama-2-7b-chat-hf, published by CharlesLi. It retains the Llama 2 architecture and its 4096-token context length. Per its model card, it was fine-tuned on the generator dataset and reaches a loss of 0.7435 on the evaluation set. Given its Llama 2 chat foundation, its primary application is likely conversational or generative tasks.
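Since this is a standard Llama 2 chat fine-tune, it can presumably be loaded with the Hugging Face `transformers` library. The sketch below is an assumption based on the base model, not documented usage: the prompt template is the stock Llama-2-chat `[INST]`/`<<SYS>>` format, and the generation settings are illustrative defaults.

```python
MODEL_ID = "CharlesLi/llama_2_sky_safe_o1_4o_reflect_1000_100_full"

def build_llama2_prompt(user_msg: str,
                        system_msg: str = "You are a helpful assistant.") -> str:
    # Llama-2-chat models expect [INST] ... [/INST] with an optional
    # <<SYS>> block; this fine-tune likely inherits that template.
    return f"[INST] <<SYS>>\n{system_msg}\n<</SYS>>\n\n{user_msg} [/INST]"

def generate(user_msg: str, max_new_tokens: int = 256) -> str:
    # Heavy imports are kept local so the prompt helper works standalone.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer(build_llama2_prompt(user_msg), return_tensors="pt").to(model.device)
    # Prompt plus completion must stay within the 4096-token context window.
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(out[0][inputs["input_ids"].shape[1]:],
                            skip_special_tokens=True)
```

Loading the 7B FP8/FP16 weights requires a GPU with sufficient memory; `device_map="auto"` lets `accelerate` place layers across available devices.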