yzhuang/Llama-2-7b-chat-hf_fictional_arc_easy_english_v3
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Apr 26, 2024 · License: llama2 · Architecture: Transformer · Open Weights
The yzhuang/Llama-2-7b-chat-hf_fictional_arc_easy_english_v3 model is a 7-billion-parameter Llama-2 variant fine-tuned by yzhuang from Meta's meta-llama/Llama-2-7b-chat-hf base model. It is designed for conversational applications, with a 4096-token context length for chat-based interactions.
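Because this model derives from Llama-2-chat, prompts are typically expected in Meta's Llama-2 chat template ([INST] / <<SYS>> markers). A minimal sketch of formatting a single-turn prompt — the helper name is illustrative, not part of any published API:

```python
def build_llama2_prompt(system: str, user: str) -> str:
    """Format a single-turn prompt in the Llama-2 chat template.

    Layout (per Meta's Llama-2 chat convention):
    <s>[INST] <<SYS>>\\n{system}\\n<</SYS>>\\n\\n{user} [/INST]
    """
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"


# Example usage (hypothetical system/user strings):
prompt = build_llama2_prompt(
    "You are a helpful assistant.",
    "Which planet is known as the Red Planet?",
)
print(prompt)
```

Multi-turn conversations repeat the `[INST] … [/INST]` pair per exchange, with the system block appearing only in the first turn; chat templates bundled with tokenizers (e.g. `tokenizer.apply_chat_template` in Hugging Face Transformers) handle this automatically.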