TinyLlama/TinyLlama-1.1B-step-50K-105b
Text Generation · Open Weights

Concurrency Cost: 1
Model Size: 1.1B
Quant: BF16
Ctx Length: 2k
Published: Sep 1, 2023
License: apache-2.0
Architecture: Transformer
TinyLlama/TinyLlama-1.1B-step-50K-105b is an intermediate checkpoint from the TinyLlama project by jzhang38: a 1.1 billion parameter model with a Llama-2-like architecture, saved at 50K training steps (about 105 billion tokens) of the project's planned 3 trillion token pretraining run. The aim is a compact yet capable language model for applications with tight compute and memory budgets, and because it adopts the Llama architecture it is drop-in compatible with existing Llama tooling.
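Because the checkpoint uses the standard Llama layout, it loads with ordinary Hugging Face tooling. A minimal sketch, assuming the transformers and torch packages are installed; the prompt and sampling values here are illustrative, not tuned recommendations:

```python
# Minimal sketch: load the checkpoint and sample a continuation.
# Prompt and generation settings are illustrative placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-step-50K-105b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 quant listed above
)

inputs = tokenizer("The TinyLlama project aims to", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note this is a raw pretraining checkpoint, not an instruction-tuned model, so it is best used for plain text continuation rather than chat-style prompting.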
Popular Sampler Settings
Top parameter combinations used by Featherless users for this model:

temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
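These parameters map onto a standard OpenAI-compatible completions request. A hedged sketch, assuming Featherless's OpenAI-compatible endpoint at https://api.featherless.ai/v1 and the openai Python client; every sampling value below is an illustrative placeholder, not a recorded user setting:

```python
# Sketch of passing the sampler settings above to an OpenAI-compatible
# endpoint. The base_url is an assumption; all sampling values are
# placeholders to be replaced with a real config.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed Featherless endpoint
    api_key="YOUR_API_KEY",
)

response = client.completions.create(
    model="TinyLlama/TinyLlama-1.1B-step-50K-105b",
    prompt="The TinyLlama project aims to",
    temperature=0.7,           # placeholder values; substitute a
    top_p=0.9,                 # recorded config once available
    frequency_penalty=0.0,
    presence_penalty=0.0,
    max_tokens=128,
    # top_k, repetition_penalty, and min_p are not standard OpenAI fields;
    # many OpenAI-compatible servers accept them as extra body parameters.
    extra_body={"top_k": 40, "repetition_penalty": 1.1, "min_p": 0.05},
)
print(response.choices[0].text)
```

The plain completions endpoint is used rather than chat completions because this is a base pretraining checkpoint without a chat template.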