CharlesLi/llama_2_sky_safe_o1_4o_reflect_1000_1000_full
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 13, 2025 · License: llama2 · Architecture: Transformer · Open weights

CharlesLi/llama_2_sky_safe_o1_4o_reflect_1000_1000_full is a 7-billion-parameter variant of Llama-2-7b-chat-hf, fine-tuned by CharlesLi on the generator dataset. It inherits the Llama 2 architecture and its 4096-token context length, and reaches a loss of 0.7917 on its evaluation set. It is therefore best suited to tasks close to its specialized fine-tuning data.
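Since the model is a Llama-2-7b-chat-hf fine-tune with open weights, it can presumably be loaded with the Hugging Face `transformers` library like any other causal language model. The sketch below is a minimal, hedged example: the repo id comes from this page, the `[INST] ... [/INST]` wrapper is the standard Llama 2 chat prompt format, and the generation settings (`max_new_tokens`, `device_map`) are illustrative assumptions rather than recommendations from the model author.

```python
def format_llama2_prompt(user_message: str) -> str:
    """Wrap a user message in the standard Llama 2 chat template ([INST] ... [/INST])."""
    return f"[INST] {user_message.strip()} [/INST]"


def main() -> None:
    # transformers is imported lazily so the prompt helper above
    # stays usable without the heavy dependency installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Repo id taken from this model card; everything else is an assumption.
    model_id = "CharlesLi/llama_2_sky_safe_o1_4o_reflect_1000_1000_full"

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    prompt = format_llama2_prompt("Summarize what fine-tuning changes in a base model.")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

    # Illustrative generation settings; tune for your workload.
    output = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(output[0], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

Note that the 4k context limit above applies to the prompt and generated tokens combined, so long prompts leave correspondingly less room for `max_new_tokens`.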
