0xA50C1A1/Llama-3.3-8B-Instruct-OmniWriter-v2
TEXT GENERATION

Concurrency Cost: 1
Model Size: 8B
Quant: FP8
Ctx Length: 32k
Published: Feb 23, 2026
License: llama3.3
Architecture: Transformer

Llama-3.3-8B-Instruct-OmniWriter-v2 is an 8-billion-parameter instruction-tuned causal language model from 0xA50C1A1, based on the Llama 3.3 architecture with a 32768-token context length. The model is fine-tuned with DPO (Direct Preference Optimization) to bias its output toward a "show, don't tell" narrative style, making it well suited to creative writing tasks, particularly psychological thrillers and descriptive storytelling. It aims to reduce common LLM "slop" and produce more vivid, visceral sensory detail in its generated text.
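Since the model follows the Llama 3.3 architecture, prompts are expected to use the standard Llama 3 instruct chat template. As a minimal sketch (assuming the stock template; in practice the tokenizer's `apply_chat_template` should be preferred), a prompt can be assembled like this:

```python
def build_llama3_prompt(system: str, user: str) -> str:
    """Build a single-turn prompt in the standard Llama 3 instruct format.

    Each message is wrapped in header tokens and terminated with <|eot_id|>;
    the trailing assistant header cues the model to generate its reply.
    """
    return (
        "<|begin_of_text|>"
        f"<|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>"
        f"<|start_header_id|>user<|end_header_id|>\n\n{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )


prompt = build_llama3_prompt(
    "You are a novelist who shows rather than tells.",
    "Open a psychological thriller in a rain-soaked parking garage.",
)
```

In day-to-day use, loading the tokenizer for `0xA50C1A1/Llama-3.3-8B-Instruct-OmniWriter-v2` and calling `apply_chat_template` yields the same structure without hand-maintaining the special tokens.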
