Danielbrdz/Barcenas-Mistral-7b
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 8k
Published: Oct 20, 2023
License: apache-2.0
Architecture: Transformer

Danielbrdz/Barcenas-Mistral-7b is a 7-billion-parameter language model fine-tuned from teknium/CollectiveCognition-v1-Mistral-7B. It specializes in Spanish, having been trained on a dataset derived from lmsys/lmsys-chat-1m, and is optimized for conversational and general text-generation tasks within its 8192-token context window.
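The card does not document a prompt template. Since the model's lineage traces back to Mistral-7B, a reasonable assumption is Mistral's `[INST]` instruct format; the sketch below builds such a prompt in plain Python. The template choice is an assumption, not something the card confirms, and the resulting string would then be fed to the model through a library such as Hugging Face `transformers`.

```python
def build_prompt(user_message: str) -> str:
    """Wrap a user message in Mistral's [INST] instruct format.

    Note: this template is assumed from the model's Mistral-7B lineage;
    the model card itself does not specify a prompt format.
    """
    return f"<s>[INST] {user_message} [/INST]"


# Example: a Spanish prompt, matching the model's target language.
prompt = build_prompt("¿Cuál es la capital de España?")
```

With `transformers` installed, the string returned by `build_prompt` can be tokenized and passed to the model's `generate` method as in any standard causal-LM workflow.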
