maritaca-ai/sabia-7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Nov 8, 2023 · Architecture: Transformer

Sabiá-7B is a 7-billion-parameter auto-regressive language model developed by Maritaca AI, built on the LLaMA-1-7B architecture. Starting from the LLaMA-1 weights, it was further pretrained on 7 billion tokens from the Portuguese subset of ClueWeb22, then trained on an additional 10 billion tokens. The model is optimized for Portuguese-language tasks and, because it was pretrained without instruction tuning, is best used with few-shot prompting.
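Because the model is not instruction-tuned, prompts should demonstrate the task with a few input/output pairs and end where the model is expected to continue. A minimal sketch of such a prompt builder (the sentiment task, field labels, and example sentences are illustrative, not from the model card):

```python
def build_few_shot_prompt(examples, query):
    """Concatenate labeled examples into a completion-style few-shot prompt.

    The prompt ends right after the final label so an auto-regressive model
    completes it with the answer.
    """
    parts = []
    for text, label in examples:
        parts.append(f"Frase: {text}\nSentimento: {label}\n")
    parts.append(f"Frase: {query}\nSentimento:")
    return "\n".join(parts)


# Illustrative Portuguese sentiment examples (hypothetical data).
examples = [
    ("Adorei o filme, recomendo a todos!", "positivo"),
    ("O atendimento foi péssimo.", "negativo"),
]
prompt = build_few_shot_prompt(examples, "O produto chegou rápido e funciona bem.")
print(prompt)
```

The resulting string would be sent as the `prompt` of a plain completion request; the trailing `Sentimento:` cue is what steers the model toward emitting only the label.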


Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model. The configurable sampler parameters are:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
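The parameters above map directly onto the fields of an OpenAI-compatible completion request. A sketch of such a request body, assuming that API shape; the numeric values are placeholders, not the "top 3" user configs:

```python
# Hypothetical sampler configuration; keys mirror the parameters listed above,
# values are illustrative defaults rather than recommended settings.
sampler_config = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

# Assembled request body for a completions-style endpoint.
payload = {
    "model": "maritaca-ai/sabia-7b",
    "prompt": "A capital do Brasil é",
    "max_tokens": 16,
    **sampler_config,
}
print(sorted(payload))
```

This dictionary would then be serialized to JSON and POSTed to the provider's completions endpoint with an API key.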