JJhooww/Mistral-7B-v0.2-Base_ptbr
Text generation · Model size: 7B · Quantization: FP8 · Context length: 8k · License: apache-2.0 · Architecture: Transformer · Open weights

JJhooww/Mistral-7B-v0.2-Base_ptbr is a 7-billion-parameter base model, continually pre-trained on approximately 1 billion Portuguese tokens starting from the official Mistral-7B-v0.2 weights. As a base model, it does not follow instructions out of the box and is intended for further fine-tuning. It shows notable improvements over the original Mistral base model on Portuguese language understanding and generation tasks, making it a solid foundation for building specialized Portuguese-centric LLM applications.
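Because this is a base (non-instruct) model, it should be prompted with plain text continuations rather than chat templates. A minimal sketch of loading it with the Hugging Face `transformers` library is shown below; the generation parameters and the Portuguese prompt are illustrative choices, not values taken from the model card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "JJhooww/Mistral-7B-v0.2-Base_ptbr"

# Load tokenizer and weights; device_map="auto" places layers on available GPUs.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",
)

# Base models complete text, so give a sentence to continue (example prompt).
prompt = "A língua portuguesa é falada em"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For instruction-following behavior, this checkpoint would first need supervised fine-tuning on an instruction dataset (e.g. with the `trl` or `peft` libraries).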
