DavidAU/Qwen3-8B-192k-Context-6X-Josiefied-Uncensored
Text generation · Model size: 8B · Quant: FP8 · Served context length: 32k · Architecture: Transformer · Concurrency cost: 1

DavidAU/Qwen3-8B-192k-Context-6X-Josiefied-Uncensored is an 8-billion-parameter Qwen3-based causal language model, derived from Goekdeniz-Guelmez's "Josiefied-Qwen3-8B-abliterated-v1." It extends the original 32k-token context window sixfold to 192k tokens using YaRN scaling, making it suitable for tasks that require processing very long inputs. The model is intended for long-form output and creative generation, and its card provides specific recommendations for optimal inference settings.
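YaRN context extension of this kind is typically declared in the model's Hugging Face `config.json` via a `rope_scaling` entry. The fragment below is a sketch of what such an entry commonly looks like for a 6x extension (192k from a 32k base, matching the "6X" in the model name); the exact keys and values used by this particular model are assumptions, not copied from its repository.

```json
{
  "max_position_embeddings": 196608,
  "rope_scaling": {
    "rope_type": "yarn",
    "factor": 6.0,
    "original_max_position_embeddings": 32768
  }
}
```

Serving stacks that honor `rope_scaling` (e.g. recent versions of transformers or vLLM) would then apply YaRN interpolation at load time; note that the hosted endpoint described above still serves a 32k context despite the model's 192k capability.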
