Delta-Vector/Hamanasu-QwQ-V2-RP
Text generation · Concurrency cost: 2 · Model size: 32.8B · Quant: FP8 · Context length: 32k · Published: Mar 12, 2025 · Architecture: Transformer
Hamanasu 32B by Delta-Vector is a chat-tuned language model based on Delta-Vector/Hamanasu-QwQ-V1.5-Instruct and fine-tuned for highly unconventional, 'brainrotted' chat interactions. It was trained on 10 million tokens, including data from platforms such as Bsky, 4chan, and Discord logs, making it better suited to humorous, non-standard conversation than to traditional roleplay.
Popular Sampler Settings
The three most popular parameter combinations among Featherless users for this model. Each config exposes the following sampler settings:
- temperature — scales the sharpness of the token probability distribution
- top_p — nucleus sampling: keeps the smallest token set whose cumulative probability exceeds p
- top_k — restricts sampling to the k most likely tokens
- frequency_penalty — penalizes tokens in proportion to how often they have already appeared
- presence_penalty — penalizes any token that has appeared at least once
- repetition_penalty — multiplicative penalty on repeated tokens
- min_p — drops tokens whose probability falls below a fraction of the top token's probability
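As a rough sketch of how these settings travel with a request, the helper below assembles a chat-completion payload carrying them. The endpoint shape (OpenAI-style `messages` plus extended sampler fields such as `top_k`, `min_p`, and `repetition_penalty`) is an assumption here; the concrete values shown are illustrative defaults, not the popular configs from the tabs above.

```python
import json

# Hedged sketch: assumes an OpenAI-compatible chat-completions payload
# extended with the sampler fields listed above. The default values are
# illustrative placeholders, not recommended settings for this model.
def build_request(prompt: str,
                  temperature: float = 0.8,
                  top_p: float = 0.95,
                  top_k: int = 40,
                  min_p: float = 0.05,
                  repetition_penalty: float = 1.05,
                  frequency_penalty: float = 0.0,
                  presence_penalty: float = 0.0) -> dict:
    """Assemble a request body that carries the sampler settings."""
    return {
        "model": "Delta-Vector/Hamanasu-QwQ-V2-RP",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
        "top_p": top_p,
        "top_k": top_k,
        "min_p": min_p,
        "repetition_penalty": repetition_penalty,
        "frequency_penalty": frequency_penalty,
        "presence_penalty": presence_penalty,
    }

payload = build_request("hello")
print(json.dumps(payload, indent=2))
```

Sending this body to the serving endpoint (and which of the extended fields it honors) depends on the provider's API.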