lightblue/openorca_stx
Text generation · Model size: 13B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Sep 12, 2023 · License: llama2 · Architecture: Transformer

lightblue/openorca_stx is a 13-billion-parameter QLoRA fine-tune by Lightblue of Open-Orca/OpenOrcaxOpenChat-Preview2-13B, with a 4096-token context length. It specializes in Japanese closed question answering and was trained on a mix of Japanese datasets including SNOW, TyDiQA (Ja), and XLSUM (Ja). The model shows improved scores on Japanese QA benchmarks such as JSQuAD, making it well suited to Japanese NLP tasks that require extracting precise answers from a provided text.
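A minimal sketch of how the model might be queried for closed QA with Hugging Face transformers. The prompt template and the `build_qa_prompt` helper are assumptions for illustration, not the model's documented format; check the upstream model card for the exact template it was trained with.

```python
def build_qa_prompt(context: str, question: str) -> str:
    # Assumed closed-QA prompt shape: the answer must come only from
    # the supplied context. This template is illustrative, not the
    # format documented for lightblue/openorca_stx.
    return (
        "以下の文章を読んで、質問に答えてください。\n\n"
        f"文章: {context}\n\n"
        f"質問: {question}\n\n"
        "回答:"
    )


def answer(context: str, question: str, max_new_tokens: int = 64) -> str:
    # Hedged inference sketch; requires the `transformers` package and
    # enough memory for a 13B model (or a quantized variant).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "lightblue/openorca_stx"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    prompt = build_qa_prompt(context, question)
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Return only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

Because the model was fine-tuned for closed QA, keeping the question grounded in an explicit context passage, as above, plays to its strengths.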
