aqweteddy/llama2-7b-capybara
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 11, 2024 · License: llama2 · Architecture: Transformer · Open weights · Cold
aqweteddy/llama2-7b-capybara is a 7-billion-parameter language model based on the Llama 2 architecture, with a 4096-token context length. The model focuses on processing and generating Traditional Chinese text, with demonstrated capabilities in narrative generation and philosophical discourse, and is intended for applications that require nuanced understanding and creation of Chinese-language content.