silvercoder67/Mistral-7b-instruct-v0.2-summ-sft-e2m
Text generation | Concurrency cost: 1 | Model size: 7B | Quantization: FP8 | Context length: 8k | Published: Jan 22, 2024 | License: cc-by-nc-4.0 | Architecture: Transformer | Open weights | Cold
silvercoder67/Mistral-7b-instruct-v0.2-summ-sft-e2m is a 7-billion-parameter instruction-tuned causal language model based on the Mistral architecture, developed by silvercoder67. The model targets general text generation, and its 8192-token context length supports coherent, extended outputs. It is intended for natural language processing scenarios that require instruction-following.
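A minimal usage sketch, assuming the checkpoint is published on the Hugging Face Hub under the repo id above and loads with the standard `transformers` API. The `format_instruct_prompt` helper and the `RUN_MODEL_DEMO` guard are illustrative additions, not part of the model card; the `[INST] ... [/INST]` wrapper follows the chat template of the Mistral-7B-Instruct family this model is named after.

```python
import os

# Repo id taken from the model card above.
REPO_ID = "silvercoder67/Mistral-7b-instruct-v0.2-summ-sft-e2m"


def format_instruct_prompt(instruction: str) -> str:
    """Wrap a user instruction in the Mistral-instruct chat format.

    The tokenizer prepends the BOS token itself, so it is omitted here.
    """
    return f"[INST] {instruction.strip()} [/INST]"


if os.environ.get("RUN_MODEL_DEMO"):
    # Gated behind an env var: this downloads ~7B parameters of weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
    model = AutoModelForCausalLM.from_pretrained(REPO_ID, device_map="auto")

    prompt = format_instruct_prompt("Summarize the following text: <your text>")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Staying within the 8k-token context window (prompt plus generated tokens) is the caller's responsibility; longer inputs should be truncated or chunked before formatting.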