zahnna/ru_hate_LLAMA2_7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 10, 2024 · Architecture: Transformer

zahnna/ru_hate_LLAMA2_7B is a 7-billion-parameter language model based on the Llama 2 architecture, shared by zahnna, with a context length of 4096 tokens. The listing does not describe any specific fine-tuning or primary differentiator, so it may be a base or general-purpose Llama 2 variant. It is suitable for general natural language processing tasks where a 7B-parameter model with a 4K context window is appropriate.
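Because the context window is fixed at 4096 tokens, prompt length directly limits how many tokens are left for generation. The sketch below shows that budgeting arithmetic; the helper function and the commented loading lines are illustrative assumptions (the loading sketch presumes the standard `transformers` API and that the weights are obtainable under the card's model id), not something stated on this card.

```python
# Sketch: budgeting a prompt against the model's 4096-token context window.
# MODEL_ID and CTX_LEN come from the card; everything else is illustrative.
MODEL_ID = "zahnna/ru_hate_LLAMA2_7B"
CTX_LEN = 4096

def max_new_tokens(prompt_token_count: int, ctx_len: int = CTX_LEN,
                   reserve: int = 0) -> int:
    """Tokens remaining for generation after the prompt (and an optional
    reserve, e.g. for special tokens) fill part of the context window."""
    return max(0, ctx_len - prompt_token_count - reserve)

# A 3500-token prompt leaves 596 tokens of generation headroom.
print(max_new_tokens(3500))   # 596
# A prompt longer than the window leaves no headroom at all.
print(max_new_tokens(5000))   # 0

# Hypothetical loading sketch (assumes the `transformers` library and
# downloadable weights; untested against this specific checkpoint):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained(MODEL_ID)
# model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
```

A small `reserve` is often kept back for chat-template or end-of-sequence tokens so generation is not cut off mid-sentence.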
