vikash06/mistral_v1
Text Generation · Open Weights
Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · Published: Dec 23, 2023 · License: MIT · Architecture: Transformer

vikash06/mistral_v1 is a 7-billion-parameter language model developed by vikash06 and fine-tuned from Llama 2. It was trained experimentally on a small dataset to evaluate how extended training on limited data affects performance. The model targets a range of natural language tasks, including creative writing, closed- and open-book question answering, summarization, information extraction, classification, and brainstorming.
