dtorres-zAgile/llama2-7b-zc-domain-misti
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Nov 2, 2023 · Architecture: Transformer

dtorres-zAgile/llama2-7b-zc-domain-misti is a 7-billion-parameter language model fine-tuned from Meta's Llama-2-7b-chat-hf. It is adapted to a specific domain, reaching a validation loss of 1.9632 after 20 training steps, and is intended for specialized applications within that domain, leveraging the Llama 2 architecture for targeted performance.
