codingwithlewis/mistralmeme
TEXT GENERATION
Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Apr 10, 2024 | Architecture: Transformer | Cold

codingwithlewis/mistralmeme is a 7-billion-parameter language model with a 4096-token context window. It is a Hugging Face Transformers model that was pushed automatically to the Hub. Because its model card contains little information, no specific differentiators or primary use cases beyond general language modeling are documented.
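Since the page describes a standard Hugging Face Transformers model on the Hub, it can presumably be loaded with the generic `AutoTokenizer`/`AutoModelForCausalLM` classes. The sketch below is a minimal, hedged example: the repo id comes from this page, but the prompt and generation settings are illustrative assumptions, and downloading the weights requires network access and the `transformers` package.

```python
MODEL_ID = "codingwithlewis/mistralmeme"  # repo id from this page

def load(model_id: str = MODEL_ID):
    """Fetch the tokenizer and weights from the Hugging Face Hub (needs network)."""
    # Imported lazily so this file can be inspected without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load()
    # Illustrative prompt; any text-generation input works the same way.
    prompt = "Write a one-line joke about compilers."
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Note that the page lists an FP8 quantization and a 4k context; whether the Hub checkpoint itself is FP8 or whether quantization is applied at serving time is not stated, so the plain `from_pretrained` call above makes no assumption about precision.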
