migtissera/Tess-10.7B-v1.5b
Text generation | Concurrency cost: 1 | Model size: 10.7B | Quant: FP8 | Context length: 4K | Published: Jan 28, 2024 | License: apache-2.0 | Architecture: Transformer | Open weights | Warm

Tess-10.7B-v1.5b is a 10.7-billion-parameter general-purpose large language model developed by migtissera and built on the SOLAR-10.7B base architecture. It is designed for broad applicability across natural language processing tasks, with a balanced performance profile for general use. Its 4096-token context length provides a solid foundation for conversational AI and text generation, and its primary strength is its versatility as a foundational model for diverse applications.
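
As a quick orientation, below is a minimal sketch of running the model locally with Hugging Face transformers. It assumes hardware with enough memory for a 10.7B-parameter model (smaller setups may need quantization or offloading); the prompt and generation settings are placeholders, not recommended values.

```python
# Minimal sketch: load and generate with Hugging Face transformers.
# Assumes the `transformers` and `accelerate` packages are installed and
# that the hardware can hold a 10.7B-parameter model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "migtissera/Tess-10.7B-v1.5b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain the difference between supervised and unsupervised learning."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Keep prompt + completion within the model's 4096-token context window.
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```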


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model tune the sampler settings listed below; a request sketch follows the list.

temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p
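
For reference, here is a hedged sketch of passing these sampler settings in a chat-completion request. It assumes an OpenAI-compatible endpoint (the Featherless API URL below is an assumption) and an API key in the FEATHERLESS_API_KEY environment variable; the numeric values are placeholders, not the popular configurations referenced above.

```python
# Sketch: send a chat-completion request with explicit sampler settings.
import os
import requests

API_URL = "https://api.featherless.ai/v1/chat/completions"  # assumed endpoint

payload = {
    "model": "migtissera/Tess-10.7B-v1.5b",
    "messages": [{"role": "user", "content": "Write a short haiku about autumn."}],
    # Sampler parameters listed on the model page (example values only):
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {os.environ['FEATHERLESS_API_KEY']}"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```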