cnmoro/Mistral-7B-Portuguese
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Architecture: Transformer · Published: Mar 3, 2024

cnmoro/Mistral-7B-Portuguese is a 7 billion parameter language model, fine-tuned from mistralai/Mistral-7B-Instruct-v0.2, specifically optimized for performance in the Portuguese language. It utilizes a 4096-token context window and was trained using Unsloth on an instruction-based Portuguese dataset. This model aims to improve instruction-following capabilities and general language understanding for Portuguese applications.
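As a minimal usage sketch, the snippet below wraps a Portuguese instruction in the `[INST] … [/INST]` chat template used by the Mistral-7B-Instruct family the model was fine-tuned from. The generation settings and the decision to keep the heavy model-loading code commented out are assumptions for illustration; the model id is taken from this card.

```python
# Hypothetical helper: format a Portuguese instruction in Mistral's
# [INST] chat template (used by Mistral-7B-Instruct-v0.2, the base model).
def build_prompt(instruction: str) -> str:
    return f"<s>[INST] {instruction} [/INST]"

prompt = build_prompt("Explique o que é aprendizado de máquina.")

# Loading and generating with Hugging Face transformers (commented out,
# since a 7B model needs substantial RAM/VRAM to run):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("cnmoro/Mistral-7B-Portuguese")
# model = AutoModelForCausalLM.from_pretrained("cnmoro/Mistral-7B-Portuguese")
# inputs = tok(prompt, return_tensors="pt")
# # Keep prompt + output within the model's 4096-token context window.
# out = model.generate(**inputs, max_new_tokens=256)
# print(tok.decode(out[0], skip_special_tokens=True))
```

In practice, `AutoTokenizer.apply_chat_template` can build this prompt automatically if the tokenizer ships a chat template; the manual version above just makes the expected format explicit.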
