uhhlt/story-emb
Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Oct 24, 2024 · License: MIT · Architecture: Transformer
uhhlt/story-emb is a 7-billion-parameter embedding model, fine-tuned from intfloat/e5-mistral-7b-instruct to produce narrative-focused representations of fictional stories. Developed by Hatzel and Biemann, it is optimized for understanding and comparing narrative content, and excels at story retrieval tasks such as finding stories with similar narrative structures.
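Story retrieval with an embedding model boils down to encoding each story as a vector and ranking candidates by cosine similarity to a query story's vector. A minimal sketch of that ranking step, using toy hand-written vectors in place of real model output (in practice the embeddings would come from uhhlt/story-emb, e.g. loaded via the Hugging Face `transformers` library):

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def rank_stories(query_emb, story_embs):
    """Return candidate indices sorted by similarity to the query, best first."""
    sims = [cosine(query_emb, emb) for emb in story_embs]
    return sorted(range(len(sims)), key=lambda i: sims[i], reverse=True)

# Toy 4-dimensional "embeddings" standing in for real model output.
query = [1.0, 0.0, 1.0, 0.0]
candidates = [
    [0.0, 1.0, 0.0, 1.0],  # dissimilar narrative
    [0.9, 0.1, 1.0, 0.0],  # similar narrative
]
ranking = rank_stories(query, candidates)  # most similar candidate first
```

With real embeddings the procedure is identical; only the vectors change, and a vector index (e.g. FAISS) would replace the brute-force loop at scale.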