aimeri/spoomplesmaxx-base-qwen3-14b
Text generation · Concurrency cost: 1 · Model size: 14B · Quant: FP8 · Ctx length: 32k · Published: Feb 10, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights
aimeri/spoomplesmaxx-base-qwen3-14b is a 14-billion-parameter continued pre-training (CPT) of Qwen3-14B-Base, developed by aimeri. The base model is trained on a curated mix of fiction, character knowledge, prose, and domain-specific corpora, with a context length of 32768 tokens. It is designed as a foundation for creative writing, character roleplay, and uncensored conversational AI, emphasizing narrative style and domain knowledge over factual accuracy. This model is the initial CPT stage of a pipeline whose subsequent SFT and DPO stages will add instruction following and alignment.
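Since this is a raw base model (no SFT or DPO yet), it should be prompted with plain text continuation rather than a chat template. A minimal sketch of loading it with the Hugging Face `transformers` library follows; the sampling parameters are illustrative assumptions, and the FP8/14B checkpoint requires a GPU with sufficient memory:

```python
# Sketch: text continuation with the CPT base model.
# Assumes `transformers` and `torch` are installed; sampling
# settings (temperature, max_new_tokens) are illustrative, not
# recommendations from the model card.
def generate_continuation(prompt: str, max_new_tokens: int = 256) -> str:
    """Continue `prompt` with aimeri/spoomplesmaxx-base-qwen3-14b.

    Base-model usage: feed plain prose and let the model continue
    it; do not apply a chat template, since instruction tuning
    happens only in the later SFT/DPO stages.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "aimeri/spoomplesmaxx-base-qwen3-14b"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.8,
    )
    # Strip the prompt tokens so only the continuation is returned.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Because the model emphasizes narrative style over factual accuracy, prompts that open a scene or a character description will generally draw on its training mix better than question-answer phrasing.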