CharlesLi/llama_2_sky_safe_o1_4o_default_4000_100_full
Text Generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 13, 2025 · License: llama2 · Architecture: Transformer · Open weights
CharlesLi/llama_2_sky_safe_o1_4o_default_4000_100_full is a 7-billion-parameter language model fine-tuned from Meta's Llama-2-7b-chat-hf. It was trained with a context length of 4096 tokens and optimized for general conversational tasks, with a reported training loss of 0.5437 on its fine-tuning dataset. Its primary applications are chat-based interaction and general language generation.
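As a usage sketch, the checkpoint can be loaded through the Hugging Face `transformers` library under the model id shown on this card. The generation settings below (`max_new_tokens`, truncation at the 4096-token context limit) are illustrative assumptions, not values recommended by the model author.

```python
# Hypothetical usage sketch for this checkpoint via Hugging Face transformers.
# MODEL_ID comes from the card; generation parameters are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "CharlesLi/llama_2_sky_safe_o1_4o_default_4000_100_full"
MAX_CTX = 4096  # context length stated on the card


def generate(prompt: str) -> str:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    # Truncate the prompt to the model's context window before generating.
    inputs = tokenizer(prompt, return_tensors="pt",
                       truncation=True, max_length=MAX_CTX)
    outputs = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Hello, how are you?"))
```

Note that a 7B model in full precision needs roughly 14 GB of memory; quantized or FP8 serving (as listed above) reduces this substantially.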