Precog 24B v1: A Creative Reasoning Model
Precog 24B v1, developed by TheDrummer, is a 24-billion-parameter language model with a 32,768-token context length, specifically engineered for creative and entertainment-focused applications. Unlike conventional models that prioritize raw intelligence or problem-solving, Precog emphasizes creativity, usability, and dynamic storytelling.
Key Capabilities
- Unique Reasoning Process: Precog generates a concise draft, or `<think>` overview, before crafting the full response, allowing users to inspect or modify the draft to influence the final output. This planning mechanism aims to improve narrative coherence and prompt adherence.
- Enhanced Narrative Flow: The model is optimized for superior storytelling and dynamic prose, making it suitable for creative writing and role-playing scenarios.
- User-Modifiable Planning: The `<think>` format is standard, enabling users to prefill or edit the model's internal reasoning process, potentially leading to better instruction following and tailored responses.
- Focus on (Dis)alignment: TheDrummer's philosophy for this model includes exploring varied 'attitudes' and 'moralities' in AI responses, moving away from strict corporate or ethical alignments to broaden imaginative scope.
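The prefill workflow described above can be sketched in a few lines of Python. This is a minimal, hypothetical helper, not an official API: the `[INST] … [/INST]` instruction template is an assumption about the underlying chat format (adapt it to whatever your inference stack actually uses), while the `<think>` tags follow the standard format the model card names.

```python
# Hypothetical helpers for Precog's <think> draft workflow.
# Assumption: an [INST]-style chat template; swap in your stack's real template.

THINK_OPEN, THINK_CLOSE = "<think>", "</think>"

def build_prefill(user_message: str, draft: str = "") -> str:
    """Build a completion prompt whose assistant turn opens with a
    (possibly user-edited) <think> draft, so the model continues from it."""
    return (
        f"[INST] {user_message} [/INST]"  # assumed instruction wrapper
        f"{THINK_OPEN}\n{draft}"          # prefilled, user-editable plan
    )

def split_think(response: str) -> tuple[str, str]:
    """Separate the model's <think> draft from the final prose."""
    if THINK_OPEN in response and THINK_CLOSE in response:
        start = response.index(THINK_OPEN) + len(THINK_OPEN)
        end = response.index(THINK_CLOSE)
        return (response[start:end].strip(),
                response[end + len(THINK_CLOSE):].strip())
    return "", response.strip()
```

In practice you would send `build_prefill(...)` to a raw text-completion endpoint (not a chat endpoint, which would insert its own template), then run `split_think` on the generation to show readers only the final prose while keeping the draft available for editing and regeneration.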
Good For
- Creative Writing & Storytelling: Excels in generating compelling narratives and dynamic content.
- Role-Playing (RP): Its reasoning process is particularly well-suited for maintaining character and plot details in RP contexts.
- Applications Requiring Flexible AI Persona: Ideal for use cases where a model's 'attitude' and 'morality' can be explored without strict alignment constraints.
- Draft-Assisted Content Generation: Useful for users who want an AI-generated outline or plan to review and adjust before the final text is produced.