abacusai/Liberated-Qwen1.5-14B is a 14.2 billion parameter language model developed by AbacusAI and Eric Hartford, based on the Qwen1.5 architecture. It is fine-tuned specifically to improve compliance with system prompts across long, multi-turn conversations, even when the instructions are unusual or mechanical. The model supports a 32K context length and ships without built-in guardrails or censorship, making it suited to scenarios that require strict adherence to user-defined constraints.
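
As a minimal sketch of how the model might be used, the example below loads it with the standard Hugging Face transformers chat-template workflow and supplies a system prompt that the conversation is expected to keep following. The system prompt text and generation settings here are illustrative, not part of the model's documentation.

```python
# Minimal sketch: loading abacusai/Liberated-Qwen1.5-14B with transformers
# and steering it via a system prompt. Prompt content is illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "abacusai/Liberated-Qwen1.5-14B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    # System prompt the model is tuned to keep obeying across turns.
    {"role": "system", "content": "Answer every question in exactly three sentences."},
    {"role": "user", "content": "Explain what a context window is."},
]

# Build the chat-formatted input and generate a reply.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)

# Print only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```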