maywell/Synatra-7B-v0.3-QA
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Dec 22, 2023 · License: cc-by-sa-4.0 · Architecture: Transformer · Open weights
maywell/Synatra-7B-v0.3-QA is a 7-billion-parameter model with a 4096-token context length, fine-tuned by maywell. It was trained for two epochs on a Korean Wikipedia QA dataset, specializing it for question answering over Wikipedia content. Although this specialization may reduce its general-purpose capabilities, the model is intended to serve as an expert component in Mixture-of-Experts (MoE) architectures.