Surzo/llama-2-7b-ssc
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 20, 2026 · License: MIT · Architecture: Transformer · Open weights · Cold

Surzo/llama-2-7b-ssc is a 7-billion-parameter language model based on the Llama 2 architecture, fine-tuned specifically for short story completion. This makes it well suited to creative writing tasks that involve extending an existing narrative. It accepts inputs up to its 4096-token context length and is tuned to generate coherent, contextually relevant story continuations.
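One practical consequence of the 4096-token context window is that the prompt and the generated continuation must fit in it together. A minimal sketch of that budgeting arithmetic (this is illustrative helper code, not part of any official API for this model):

```python
def max_new_tokens(prompt_tokens: int, context_length: int = 4096) -> int:
    """Return how many new tokens can be generated given a prompt size.

    The prompt and the completion share the model's context window, so the
    remaining budget is the window size minus the prompt length (never
    negative). `context_length` defaults to this model's 4096-token window.
    """
    return max(context_length - prompt_tokens, 0)


# A 1000-token story prompt leaves room for up to 3096 tokens of continuation.
print(max_new_tokens(1000))  # 3096

# A prompt longer than the window leaves no generation budget; in practice
# such a prompt would need to be truncated before being sent to the model.
print(max_new_tokens(5000))  # 0
```

In a real pipeline you would count `prompt_tokens` with the model's own tokenizer rather than estimating, since token counts vary by tokenizer.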
