CharlesLi/llama_2_cot_simplest_alpaca_0_full
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 20, 2025 · License: llama2 · Architecture: Transformer · Open weights
CharlesLi/llama_2_cot_simplest_alpaca_0_full is a 7-billion-parameter variant of Llama-2-7b-chat-hf, fine-tuned by CharlesLi on a generator dataset. Built on the Llama 2 architecture, it targets conversational and text generation tasks and is intended for applications that need a specialized Llama 2 chat model with a 4096-token context length.
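As a Llama-2-chat derivative, the model can be loaded with the Hugging Face `transformers` library and prompted with the standard Llama 2 chat template. The sketch below assumes the repo id from this card is available on the Hub and that the checkpoint fits on the local device; the `build_prompt` helper is an illustrative wrapper, not part of the model's release.

```python
MODEL_ID = "CharlesLi/llama_2_cot_simplest_alpaca_0_full"  # repo id from this card


def build_prompt(user_message: str, system_prompt: str = "") -> str:
    """Wrap a user message in the Llama 2 chat template ([INST] ... [/INST])."""
    if system_prompt:
        return (
            f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
            f"{user_message} [/INST]"
        )
    return f"<s>[INST] {user_message} [/INST]"


if __name__ == "__main__":
    # Loading requires the transformers library and enough memory for a 7B model.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    prompt = build_prompt("List three uses of a paperclip.")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    # The context window is 4096 tokens; keep prompt + generation under that.
    outputs = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since the card does not document a custom chat format, the stock Llama 2 template above is a reasonable default; `tokenizer.apply_chat_template` can be used instead if the repo ships a chat template.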