CharlesLi/llama_2_o1_01_full
Text Generation

- Concurrency Cost: 1
- Model Size: 7B
- Quant: FP8
- Ctx Length: 4k
- Published: Jan 7, 2025
- License: llama2
- Architecture: Transformer
- Open Weights

CharlesLi/llama_2_o1_01_full is a 7-billion-parameter language model fine-tuned from Meta's Llama-2-7b-chat-hf. It was trained for 1 epoch with a learning rate of 2e-05 and a context length of 4096 tokens. The card does not detail what differentiates this checkpoint, but its fine-tuning from a chat model suggests it targets conversational or instruction-following tasks.
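The card provides no usage instructions. A minimal sketch of querying the checkpoint with the Hugging Face `transformers` library might look like the following; it assumes the weights are hosted under the repo id shown above and that a single-turn Llama-2 chat template applies (since the base model is Llama-2-7b-chat-hf):

```python
def build_llama2_chat_prompt(system: str, user: str) -> str:
    """Format a single-turn prompt in the standard Llama-2 chat template."""
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the checkpoint and generate a completion.

    Assumes the repo id "CharlesLi/llama_2_o1_01_full" and enough GPU
    memory for a 7B model (~14 GB in FP16); imports are kept inside the
    function because loading the weights is expensive.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo_id = "CharlesLi/llama_2_o1_01_full"  # assumption: weights live here
    tok = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

    inputs = tok(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tok.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

With the 4k context limit above, prompt plus `max_new_tokens` should stay under 4096 tokens.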
