aiplanet/buddhi-128k-chat-7b
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 8k · Published: Apr 2, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Buddhi-128K-Chat is a 7 billion parameter general-purpose chat model developed by AI Planet, fine-tuned from Mistral 7B Instruct. It is optimized for an extended context length of up to 128,000 tokens using the YaRN technique. The model is suited to tasks that require extensive context retention, such as comprehensive document summarization, detailed narrative generation, and multi-step question answering over long inputs.
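YaRN extends a model's RoPE positional encoding beyond its original training window. In Hugging Face `transformers`, this is typically expressed as a `rope_scaling` entry in the model's `config.json`, roughly of the shape below. The values here are illustrative only, not taken from this model's actual configuration:

```json
{
  "max_position_embeddings": 131072,
  "rope_scaling": {
    "type": "yarn",
    "factor": 4.0,
    "original_max_position_embeddings": 32768
  }
}
```

The `factor` is chosen so that the original context window multiplied by the scaling factor reaches the target length (here, roughly 128K tokens).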


Popular Sampler Settings

The three parameter combinations most commonly used by Featherless users for this model. Each configuration tunes the following sampler parameters:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
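A sketch of how these sampler parameters might be applied in a request, assuming an OpenAI-compatible chat completions endpoint. All parameter values below are illustrative placeholders, not the actual popular configurations from the page, and the endpoint URL is an assumption:

```python
# Build a request payload applying a full sampler configuration to this model.
# All sampler values are illustrative placeholders, not real user configs.
payload = {
    "model": "aiplanet/buddhi-128k-chat-7b",
    "messages": [{"role": "user", "content": "Summarize this document."}],
    "temperature": 0.7,          # randomness of sampling
    "top_p": 0.9,                # nucleus sampling cutoff
    "top_k": 40,                 # restrict sampling to the k most likely tokens
    "frequency_penalty": 0.0,    # penalize tokens by how often they recur
    "presence_penalty": 0.0,     # penalize tokens that have appeared at all
    "repetition_penalty": 1.1,   # multiplicative penalty on repeated tokens
    "min_p": 0.05,               # drop tokens below this relative probability
}

# The payload can then be POSTed as JSON to the provider's chat completions
# endpoint (URL and auth scheme assumed here), e.g. with the requests library:
# requests.post("https://api.featherless.ai/v1/chat/completions",
#               json=payload, headers={"Authorization": "Bearer <API_KEY>"})
```

Not every OpenAI-compatible provider accepts all seven parameters; unsupported ones (such as `repetition_penalty` or `min_p`) may be ignored or rejected depending on the backend.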