haoranxu/ALMA-13B-Pretrain
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quantization: FP8 · Context Length: 4K · Published: Sep 17, 2023 · License: MIT · Architecture: Transformer · Open Weights
ALMA-13B-Pretrain by haoranxu is a 13-billion-parameter LLaMA-2-based language model produced by continued pre-training on 12 billion monolingual tokens. It is the foundation model for the ALMA (Advanced Language Model-based Translator) series, which specializes in machine translation. On its own it is not yet a translator: it must first be LoRA fine-tuned on parallel data, which distinguishes it from the fully fine-tuned ALMA translation models. A sketch of that step follows.
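Since this checkpoint only becomes a translator after LoRA fine-tuning on parallel data, here is a minimal sketch of that step using Hugging Face transformers and peft. The LoRA hyperparameters and the training setup are illustrative assumptions, not the ALMA authors' exact recipe; only the prompt format matches the one published with ALMA.

```python
# Minimal LoRA fine-tuning sketch for ALMA-13B-Pretrain.
# Assumptions: transformers + peft are installed; hyperparameters
# below are placeholders, not the ALMA paper's exact recipe.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_id = "haoranxu/ALMA-13B-Pretrain"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# LLaMA-2-style attention projections are the usual LoRA targets.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the LoRA adapters train

# Training on parallel (source, target) pairs would follow here,
# e.g. with transformers.Trainer, using ALMA's prompt format:
prompt = "Translate this from German to English:\nGerman: Guten Morgen!\nEnglish:"
```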
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model; an example request passing these parameters appears after the list below.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
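No values were recorded above, so as a hedged illustration, the sketch below shows how sampler parameters like these would typically be passed to an OpenAI-compatible completions endpoint such as the one Featherless exposes. The base URL and every sampler value are assumptions for demonstration, not recorded user configs for this model.

```python
# Hedged sketch: passing the sampler parameters listed above through
# an OpenAI-compatible completions API. The base URL and all values
# here are illustrative assumptions, not saved Featherless configs.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed endpoint
    api_key="YOUR_FEATHERLESS_API_KEY",
)

response = client.completions.create(
    model="haoranxu/ALMA-13B-Pretrain",
    prompt="Translate this from German to English:\nGerman: Guten Morgen!\nEnglish:",
    max_tokens=64,
    temperature=0.7,        # sampling randomness
    top_p=0.9,              # nucleus sampling cutoff
    frequency_penalty=0.0,  # penalize frequently repeated tokens
    presence_penalty=0.0,   # penalize tokens already present
    # top_k, repetition_penalty, and min_p are not in the core OpenAI
    # schema; providers that support them accept extra body fields:
    extra_body={"top_k": 40, "repetition_penalty": 1.1, "min_p": 0.05},
)
print(response.choices[0].text)
```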