allura-org/Bigger-Body-70b
- Task: text generation
- Concurrency cost: 4
- Model size: 70B
- Quantization: FP8
- Context length: 32k
- Published: Mar 14, 2025
- License: llama3.3
- Architecture: Transformer
allura-org/Bigger-Body-70b is a 70-billion-parameter language model from allura-org, designed for adaptive persona emulation and roleplay enhancement. It supports a 32,768-token context length and is primarily optimized for narrative scaffolding and character initialization sequences. It excels in English and also supports Chinese, with ongoing research into its performance in that language.
Popular Sampler Settings
The three most popular parameter combinations used by Featherless users for this model are built from the following sampler settings:
- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
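As a rough illustration, the sampler settings above map directly onto fields of an OpenAI-compatible chat-completion request; Featherless exposes such an endpoint. This is a minimal sketch: the base URL and the specific sampler values are assumptions for illustration, not the actual popular configurations from this page.

```python
# Hypothetical request payload for allura-org/Bigger-Body-70b.
# Sampler values below are illustrative placeholders, NOT the
# measured popular configs; the endpoint URL is also an assumption.
payload = {
    "model": "allura-org/Bigger-Body-70b",
    "messages": [
        {"role": "user", "content": "Introduce your character in two sentences."}
    ],
    # Sampler settings listed on the model page:
    "temperature": 0.8,
    "top_p": 0.95,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
    "max_tokens": 256,
}

# Sending the request would look roughly like this (needs an API key):
# import requests
# resp = requests.post(
#     "https://api.featherless.ai/v1/chat/completions",  # assumed URL
#     headers={"Authorization": "Bearer YOUR_API_KEY"},
#     json=payload,
# )
print(sorted(payload.keys()))
```

The network call is left commented out so the sketch stays self-contained; in practice any OpenAI-compatible client library can send the same payload.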