IIC/RigoChat-7b-v2
Text Generation
Concurrency Cost: 1
Model Size: 7.6B
Quant: FP8
Context Length: 32k
Published: Nov 22, 2024
License: other
Architecture: Transformer
IIC/RigoChat-7b-v2 is a 7.6 billion parameter Qwen-2.5-based causal language model developed by the Instituto de Ingeniería del Conocimiento (IIC). It is fine-tuned using Direct Preference Optimization (DPO) for enhanced performance on Spanish language tasks, and it excels particularly in generalist tasks and Retrieval-Augmented Generation (RAG) over Spanish databases, where it reduces hallucinations. The model supports a 131,072-token context length and is optimized for a range of Spanish NLP tasks, including tool use, summarization, math, code, and abstractive QA.
Popular Sampler Settings
The three most popular sampler parameter combinations used by Featherless users for this model.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
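As a rough sketch of how the sampler settings listed above would map onto a request, the snippet below assembles a chat-completions payload in the OpenAI-compatible style. The endpoint URL, the `build_payload` helper, and all numeric values are illustrative assumptions, not values taken from this page; `top_k`, `min_p`, and `repetition_penalty` are extensions that only some OpenAI-compatible servers accept.

```python
import json

# Assumed endpoint for illustration only; check the provider's docs for the real URL.
API_URL = "https://api.featherless.ai/v1/chat/completions"


def build_payload(prompt: str, **samplers) -> dict:
    """Assemble a chat-completions payload with optional sampler overrides."""
    payload = {
        "model": "IIC/RigoChat-7b-v2",
        "messages": [{"role": "user", "content": prompt}],
    }
    # Standard OpenAI-style sampler fields.
    for key in ("temperature", "top_p", "frequency_penalty", "presence_penalty"):
        if key in samplers:
            payload[key] = samplers[key]
    # Extensions supported by some OpenAI-compatible servers; passed through only if given.
    for key in ("top_k", "min_p", "repetition_penalty"):
        if key in samplers:
            payload[key] = samplers[key]
    return payload


# Example with made-up sampler values and a Spanish prompt, matching the model's focus.
payload = build_payload(
    "Resume este texto en una frase.",
    temperature=0.7,
    top_p=0.9,
    repetition_penalty=1.05,
)
print(json.dumps(payload, indent=2, ensure_ascii=False))
```

Only the sampler keys you actually pass end up in the payload, so a config that omits, say, `min_p` falls back to the server's default for that parameter.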