unionai/Llama-2-7b-hf-wikipedia
Task: text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
License: apache-2.0
Architecture: Transformer
Weights: open

The unionai/Llama-2-7b-hf-wikipedia model is a 7-billion-parameter model based on the Llama 2 architecture, fine-tuned on Wikipedia data. It is designed for tasks that require broad factual knowledge and information retrieval from an encyclopedic source. With a 4096-token context length, it is suited to generating informative, coherent text grounded in its specialized training data.
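A minimal usage sketch with the Hugging Face `transformers` library, assuming the checkpoint is published on the Hub under the id above. The prompt template, character budget, and generation settings are illustrative assumptions, not part of the model card.

```python
MODEL_ID = "unionai/Llama-2-7b-hf-wikipedia"  # assumed Hub id
MAX_CONTEXT = 4096  # context length stated on the card


def build_prompt(question: str, max_chars: int = 2000) -> str:
    """Trim the question with a rough character budget so the full
    prompt stays well under the 4k-token context window."""
    return f"Answer using encyclopedic knowledge:\n{question[:max_chars]}"


def generate(question: str) -> str:
    # Imports kept inside the function so the sketch is readable
    # without `transformers`/`torch` installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

A character-based trim is only a heuristic; for strict enforcement of the 4096-token limit, count tokens with the model's own tokenizer before generation.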
