satt0821/affine-001
Text Generation
Concurrency Cost: 1
Model Size: 4B
Quant: BF16
Ctx Length: 32k
Published: Dec 14, 2025
Architecture: Transformer
Status: Warm
satt0821/affine-001 is a 4-billion-parameter language model with a 40,960-token context length. Its model card does not yet provide details on the architecture, training, or specific capabilities, so its primary differentiators and optimal use cases remain undefined.
Popular Sampler Settings
The top 3 parameter combinations used by Featherless users for this model. No values have been recorded yet.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
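Once values are published, sampler settings like these are typically passed as fields of an OpenAI-compatible completions request. The sketch below assembles such a request body; the `build_sampler_payload` helper, the example values, and the endpoint mentioned in the comment are illustrative assumptions, not part of this model's documentation.

```python
def build_sampler_payload(prompt: str, **sampler) -> dict:
    """Merge user-chosen sampler settings into a completions request body."""
    payload = {
        "model": "satt0821/affine-001",
        "prompt": prompt,
        "max_tokens": 256,
    }
    # Only forward the sampler parameters listed in the table above.
    allowed = {
        "temperature", "top_p", "top_k",
        "frequency_penalty", "presence_penalty",
        "repetition_penalty", "min_p",
    }
    payload.update({k: v for k, v in sampler.items() if k in allowed})
    return payload

body = build_sampler_payload(
    "Write a haiku about autumn.",
    temperature=0.7,          # placeholder values; the table lists none yet
    top_p=0.9,
    repetition_penalty=1.1,
)
# The body would then be POSTed to the provider's completions endpoint.
```

Including only the parameters the caller actually sets keeps the request minimal and lets the server apply its own defaults for everything else.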