Sao10K/Fimbulvetr-11B-v2.1-16K
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 10.7B · Quant: FP8 · Ctx Length: 4k · Published: Jun 25, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights · Warm

Sao10K/Fimbulvetr-11B-v2.1-16K is a 10.7 billion parameter language model, an extended version of Fimbulvetr-v2 whose context length has been increased to 16K tokens using PoSE. While it maintains coherence up to 16K tokens, its answers are most consistent and reliable at around 11K tokens of context, making it well suited to long-context applications such as roleplaying. Its primary differentiator is this extended context window, which gives it robust recall over moderately long sequences.


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model adjust the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
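As a sketch of how these settings are used in practice, the snippet below builds a request body for an OpenAI-compatible completions endpoint with each of the sampler parameters listed above. The endpoint URL, API key placeholder, and every numeric value are illustrative assumptions, not the actual Featherless presets.

```python
import json

# Illustrative sampler configuration -- these values are assumptions,
# not the actual "Top 3" configs used by Featherless users.
payload = {
    "model": "Sao10K/Fimbulvetr-11B-v2.1-16K",
    "prompt": "Once upon a time,",
    "max_tokens": 256,
    "temperature": 0.8,        # randomness of sampling (higher = more varied)
    "top_p": 0.95,             # nucleus sampling: keep top tokens summing to 95%
    "top_k": 40,               # restrict sampling to the 40 most likely tokens
    "frequency_penalty": 0.0,  # penalize tokens by how often they have appeared
    "presence_penalty": 0.0,   # penalize tokens that have appeared at all
    "repetition_penalty": 1.1, # multiplicative penalty on repeated tokens
    "min_p": 0.05,             # drop tokens below 5% of the top token's probability
}

body = json.dumps(payload)
print(body)

# Sending the request would look roughly like this (hypothetical endpoint
# and key -- consult the provider's API documentation for the real values):
# import urllib.request
# req = urllib.request.Request(
#     "https://api.featherless.ai/v1/completions",
#     data=body.encode(),
#     headers={"Authorization": "Bearer <API_KEY>",
#              "Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```

Note that `repetition_penalty` and `min_p` are extensions found in open-source inference stacks rather than part of the original OpenAI schema, so support depends on the serving backend.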