OpenLLM-Ro/RoMistral-7b-Instruct-2024-10-09
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Sep 23, 2024 | License: cc-by-nc-4.0 | Architecture: Transformer | Open Weights | Cold

OpenLLM-Ro/RoMistral-7b-Instruct-2024-10-09 is a 7-billion-parameter instruction-tuned generative text model developed by OpenLLM-Ro and specialized for the Romanian language. Fine-tuned from Mistral-7B-v0.1 on a diverse set of Romanian instruction datasets, it is built for assistant-style chat and a range of natural language tasks. The model is part of the first open-source effort to build LLMs specifically for Romanian: it excels on Romanian-specific benchmarks such as RoCulturaBench and performs strongly on tasks like sentiment analysis (LaRoSeDa) and semantic textual similarity (STS), all within a 4096-token context window.
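As an instruction-tuned chat model, it expects prompts in an instruct format. A minimal sketch of prompt construction and generation is below; the `[INST] ... [/INST]` wrapping is an assumption carried over from the base Mistral instruct convention, and the tokenizer's own chat template (if one is shipped with the model) should be preferred over the hypothetical `build_prompt` helper shown here.

```python
def build_prompt(user_message: str, system: str = "") -> str:
    """Wrap a user message in a Mistral-style instruct prompt.

    Assumption: the fine-tune follows the base Mistral [INST] convention;
    check the model's tokenizer chat template before relying on this.
    """
    body = f"{system}\n\n{user_message}".strip()
    return f"<s>[INST] {body} [/INST]"

prompt = build_prompt("Care este capitala României?")

# With transformers installed and sufficient VRAM, generation would
# look roughly like this (sketch, not run here):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# repo = "OpenLLM-Ro/RoMistral-7b-Instruct-2024-10-09"
# tok = AutoTokenizer.from_pretrained(repo)
# model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")
# inputs = tok(prompt, return_tensors="pt").to(model.device)
# out = model.generate(**inputs, max_new_tokens=128)
# print(tok.decode(out[0], skip_special_tokens=True))
```

Keeping the full prompt (system text plus conversation history) under the 4k context limit is the caller's responsibility; truncate older turns before they push past 4096 tokens.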
