emozilla/scifi-fantasy-author-7b-8k_delta
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · License: apache-2.0 · Architecture: Transformer · Open Weights
The emozilla/scifi-fantasy-author-7b-8k_delta is a 7-billion-parameter LLaMA-based delta model, fine-tuned specifically for generating narrative fiction. It excels in the Science Fiction and Fantasy genres, leveraging an 8192-token context length for coherent storytelling. As a delta model, it must be applied on top of the original LLaMA weights to yield its specialized creative-writing capabilities.
Model Overview
The emozilla/scifi-fantasy-author-7b-8k_delta is a 7-billion-parameter LLaMA-based model fine-tuned for creative narrative generation. It is distributed as a delta model and must be applied on top of the original LLaMA weights, as detailed in the FastChat instructions.
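Conceptually, applying a delta model means adding the delta's weights to the corresponding base-model weights, tensor by tensor (in practice you would use FastChat's weight-conversion tooling, which also handles tokenizer files and sharded checkpoints; the exact command depends on your FastChat version). A minimal sketch of the merge step, using plain Python lists as stand-ins for weight tensors:

```python
def apply_delta(base_state, delta_state):
    """Merge a delta checkpoint into a base checkpoint.

    Illustrative sketch only: merged weight = base weight + delta weight,
    per parameter. Real checkpoints hold tensors, not Python lists.
    """
    merged = {}
    for name, base_weights in base_state.items():
        delta_weights = delta_state[name]  # names must match exactly
        merged[name] = [b + d for b, d in zip(base_weights, delta_weights)]
    return merged

# Toy state dicts standing in for the LLaMA base and the delta release.
base = {"layer0.weight": [1.0, 2.0, 3.0]}
delta = {"layer0.weight": [0.5, -0.5, 0.25]}
print(apply_delta(base, delta)["layer0.weight"])  # [1.5, 1.5, 3.25]
```

The important property is that the delta alone is not a usable model; only the element-wise sum with the matching base weights reconstructs the fine-tuned checkpoint.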
Key Capabilities
- Genre-Specific Narrative Generation: Specialized in producing fictional content within the Science Fiction and Fantasy genres.
- Contextual Coherence: Trained with an 8192 token context length, enabling the generation of longer, more coherent story segments.
- Optimized Training: Achieved a training loss of 2.008 over 3 epochs, using a cosine learning-rate scheduler with a learning rate of 2e-5.
Good For
- Creative Writers: Assisting authors in generating plot points, character dialogues, or descriptive passages for sci-fi and fantasy stories.
- Game Developers: Creating lore, quest descriptions, or dynamic narrative elements for games in speculative fiction settings.
- Prototyping Narratives: Rapidly developing story ideas and exploring different narrative directions within its specialized genres.