icefog72/IceTeaRP-7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · Published: Mar 28, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights

IceTeaRP-7b by icefog72 is a 7-billion-parameter language model, created by SLERP-merging Kunokukulemonchini-7b with a BigLM7-7b merge. The model is designed to handle a 32k context window without scaling, making it suitable for applications that require extended conversational memory or long-document processing. It is particularly noted for roleplay, offering a good balance of coherence and context retention.
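As a rough illustration of local use, the sketch below loads the model with Hugging Face transformers and generates a short roleplay reply. It assumes the weights are published at icefog72/IceTeaRP-7b and that a GPU plus the accelerate package are available; the prompt and generation settings are placeholders, not recommended values.

```python
# Minimal sketch: running IceTeaRP-7b locally with Hugging Face transformers.
# Assumes the repo id icefog72/IceTeaRP-7b and a CUDA-capable GPU; the hosted
# endpoint serves FP8, but FP16 is used here for simplicity.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "icefog72/IceTeaRP-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Placeholder roleplay prompt; in practice use the prompt format the model card recommends.
prompt = "You are a tavern keeper in a fantasy setting. Greet the traveler who just walked in."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```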


Popular Sampler Settings

The three most common sampler configurations used by Featherless users for this model adjust the parameters listed below; a sketch of passing such settings through an API follows the list.

temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p
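The sketch below shows how these sampler settings might be passed to an OpenAI-compatible endpoint using the openai Python client. The base URL, API key, and every parameter value are illustrative placeholders, not the actual top community configurations; parameters outside the standard OpenAI schema (top_k, repetition_penalty, min_p) are sent via extra_body, which only works if the backend accepts them.

```python
# Sketch: sending sampler settings to an OpenAI-compatible completions endpoint.
# Values below are placeholders; substitute the config you want to reproduce.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumed OpenAI-compatible base URL
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="icefog72/IceTeaRP-7b",
    messages=[{"role": "user", "content": "Stay in character as a ship captain and describe the storm."}],
    temperature=0.8,           # standard OpenAI-schema sampler settings
    top_p=0.95,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # Non-standard sampler settings go in extra_body when the server supports them.
    extra_body={"top_k": 40, "repetition_penalty": 1.1, "min_p": 0.05},
)
print(response.choices[0].message.content)
```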