CharlesLi/llama_2_rlhf_safe_4o_default_1000_full
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 13, 2025 · License: llama2 · Architecture: Transformer · Open Weights · Cold
CharlesLi/llama_2_rlhf_safe_4o_default_1000_full is a 7-billion-parameter model fine-tuned by CharlesLi from Llama-2-7b-chat-hf. It was fine-tuned on a generator dataset, indicating optimization for text-generation tasks. The model retains the Llama 2 architecture with a 4096-token context length and uses RLHF to steer outputs toward safe, aligned responses.