hamzah0asadullah/Perexiguus-0.6B
Text Generation · Concurrency cost: 1 · Model size: 0.8B · Quant: BF16 · Context length: 32K · Published: Mar 7, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

Perexiguus-0.6B by hamzah0asadullah is a 0.6-billion-parameter causal language model, fine-tuned from Qwen3-0.6B and optimized for diverse, roleplay-focused conversations. It generates engaging and coherent roleplay interactions, making it one of the smallest usable roleplay models available. The model has a native context length of 32K tokens and a 28-layer architecture with grouped-query attention (GQA), making it well suited to resource-constrained roleplay applications.
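As a sketch of how a model like this might be used, the snippet below loads the checkpoint with the Hugging Face `transformers` library and runs one roleplay turn through the chat template. It assumes the model is published on the Hugging Face Hub under the id `hamzah0asadullah/Perexiguus-0.6B` and that `transformers` and `torch` are installed; the persona and user turn are invented placeholders.

```python
# Sketch: one roleplay turn with Perexiguus-0.6B via Hugging Face transformers.
# Assumes the model id below resolves on the Hub; adjust as needed.
MODEL_ID = "hamzah0asadullah/Perexiguus-0.6B"


def build_messages(persona: str, user_turn: str) -> list[dict]:
    """Build a chat-format message list: a system persona plus one user turn."""
    return [
        {"role": "system", "content": persona},
        {"role": "user", "content": user_turn},
    ]


def main() -> None:
    # Imported here so the helper above stays usable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the model card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    messages = build_messages(
        "You are a grizzled tavern keeper in a rain-soaked fantasy town.",
        "I push open the door and shake the rain off my cloak.",
    )
    # apply_chat_template formats the turns with the model's chat template and
    # appends the assistant prefix so generation continues in-character.
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, not the echoed prompt.
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

At 0.6B parameters in BF16, the weights fit comfortably in a few gigabytes of memory, which is what makes this kind of model practical on resource-constrained hardware.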
