rhaymison/Mistral-portuguese-luana-7b-chat
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Apr 9, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

rhaymison/Mistral-portuguese-luana-7b-chat is a 7-billion-parameter language model fine-tuned from Mistral 7B by rhaymison for chat interaction in Portuguese. It was trained on a corpus of 250,000 Portuguese chat conversations to address the scarcity of models for the language. The model is optimized for generating conversational responses and for use cases requiring natural-language understanding and generation in Portuguese, particularly chat applications.
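As a minimal sketch of how a chat turn might be prepared for this model: the assumption here is that, as a Mistral 7B fine-tune, it follows the base Mistral instruction format (`[INST] ... [/INST]`); the helper name `build_prompt` and the example messages are illustrative, and the model card's own chat template should be checked before use.

```python
def build_prompt(user_message: str, system: str = "") -> str:
    """Wrap a Portuguese user message in Mistral-style instruction tags.

    Assumes the model uses the base Mistral 7B [INST] ... [/INST] format;
    verify against the model card's chat template before relying on this.
    """
    instruction = f"{system}\n{user_message}".strip() if system else user_message
    return f"<s>[INST] {instruction} [/INST]"

# Example: a Portuguese question wrapped for generation.
prompt = build_prompt("Qual é a capital do Brasil?")
print(prompt)
```

The resulting string would be passed to the tokenizer and model (e.g. via the Hugging Face `transformers` library) for text generation; the model's reply follows the closing `[/INST]` tag.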
