Nohobby/MS3-Tantum-24B-v0.1
Text generation · Concurrency cost: 2 · Model size: 24B · Quant: FP8 · Context length: 32k · Published: Feb 27, 2025 · License: apache-2.0 · Architecture: Transformer

Nohobby/MS3-Tantum-24B-v0.1 is a 24-billion-parameter merged language model based on the Mistral-Small-24B architecture, developed by Nohobby. The model is optimized for character adherence and roleplay scenarios and incorporates a dedicated tag for internal monologue. It aims to deliver strong prose generation and character consistency, and may outperform other 22B and 32B merges in its class.
