andylolu24/ollm-wikipedia
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Oct 12, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

The andylolu24/ollm-wikipedia model is a 7-billion-parameter language model fine-tuned by andylolu24 from the Mistral-7B-Instruct-v0.2 base. It has a 4096-token context window and is tuned for generating text grounded in Wikipedia content; its primary use cases are information retrieval and content generation over a Wikipedia-centric knowledge base.
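Since the base model is Mistral-7B-Instruct-v0.2, prompts presumably follow the Mistral `[INST] … [/INST]` chat format, and the prompt plus the generated answer must fit within the 4096-token context window. A minimal sketch of assembling such a prompt while enforcing that budget (the prompt template, the generation budget, and the whitespace token count are illustrative assumptions; in practice the real tokenizer from `transformers.AutoTokenizer` would be used, and the fine-tune's actual prompt format may differ):

```python
# Sketch: build a Mistral-instruct-style prompt and trim the retrieved
# Wikipedia excerpt so the whole prompt fits the 4k context window.
# NOTE: whitespace splitting is a rough stand-in for the model's real
# tokenizer; the template and budget values are illustrative assumptions.

CTX_LEN = 4096     # context window of the model (from the model card)
GEN_BUDGET = 512   # tokens reserved for the generated answer (assumed)

def build_prompt(question: str, wiki_context: str) -> str:
    """Assemble an [INST]-wrapped prompt, truncating the Wikipedia
    excerpt so prompt + generation stay within CTX_LEN tokens."""
    template = (
        "<s>[INST] Using the Wikipedia excerpt below, answer the question.\n\n"
        "{ctx}\n\nQuestion: {q} [/INST]"
    )
    # Tokens consumed by the template and question alone (excerpt empty).
    fixed = len(template.format(ctx="", q=question).split())
    budget = CTX_LEN - GEN_BUDGET - fixed
    ctx_tokens = wiki_context.split()
    if len(ctx_tokens) > budget:
        ctx_tokens = ctx_tokens[:budget]  # keep the leading part of the excerpt
    return template.format(ctx=" ".join(ctx_tokens), q=question)

prompt = build_prompt(
    "Who founded Wikipedia?",
    "Wikipedia is a free online encyclopedia " * 800,  # oversized excerpt
)
```

The same budgeting logic applies unchanged when the whitespace count is replaced by `len(tokenizer.encode(...))` from the model's actual tokenizer.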
