saschka882/gst-copywriter-v1
Task: Text generation
Model size: 7B
Quantization: FP8
Context length: 4k
Concurrency cost: 1
Published: Apr 7, 2026
License: apache-2.0
Architecture: Transformer (open weights)
saschka882/gst-copywriter-v1 is a 7-billion-parameter instruction-tuned causal language model, fine-tuned from Mistral-7B-Instruct-v0.3 by saschka882. It targets copywriting tasks, although the exact tasks and the fine-tuning dataset are not documented, and it inherits the Mistral-7B base's efficient text generation within a 4096-token context window.
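Since the model derives from Mistral-7B-Instruct-v0.3, it should load through the standard Hugging Face `transformers` causal-LM interface. The following is a minimal usage sketch, not an official example: it assumes the model is hosted on the Hugging Face Hub under this ID, that it retains the base model's chat template, and that the host has enough memory for a 7B checkpoint.

```python
# Minimal usage sketch (assumptions: the checkpoint is on the Hugging Face
# Hub under this ID and keeps the Mistral-7B-Instruct chat template).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "saschka882/gst-copywriter-v1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# A hypothetical copywriting prompt, formatted with the chat template.
messages = [{"role": "user",
             "content": "Write a two-line tagline for a reusable water bottle."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

# Keep prompt + generation inside the 4096-token context window.
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

FP8 quantization support depends on the serving stack; the plain `transformers` load above would use the weights' default precision unless a quantization config is supplied.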