aipib/karasu-lora-jp-qa-chat
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 1.1B · Quant: BF16 · Ctx Length: 2k · Published: Apr 24, 2024 · Architecture: Transformer
aipib/karasu-lora-jp-qa-chat is a 1.1 billion parameter language model fine-tuned with the LoRA method on an original Japanese Q&A dataset. It is based on a merge of lightblue/karasu-1.1B and yuiseki/karasu-sake-qa-v0.1 and is optimized for question-answering tasks. It is particularly useful in Retrieval-Augmented Generation (RAG) systems that supply reference documents as input, producing relevant answers in Japanese.
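As a minimal sketch, the model can be driven through the Hugging Face transformers library in a RAG-style flow: a retrieved reference passage and a question are combined into a single prompt, which the model completes. The prompt template below is a hypothetical illustration, not the format documented for this model; check the tokenizer's chat template for the actual expected layout.

```python
def build_rag_prompt(context: str, question: str) -> str:
    """Combine a retrieved reference passage and a question into one
    Japanese Q&A prompt (hypothetical format, for illustration only)."""
    return (
        "以下の参考文書に基づいて質問に答えてください。\n"
        f"参考文書: {context}\n"
        f"質問: {question}\n"
        "回答: "
    )


def generate_answer(context: str, question: str) -> str:
    """Load aipib/karasu-lora-jp-qa-chat and generate an answer.

    Imports are local so the prompt helper above stays usable
    without torch/transformers installed.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "aipib/karasu-lora-jp-qa-chat"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # BF16 weights, matching the quantization listed in the metadata above.
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

    inputs = tokenizer(build_rag_prompt(context, question), return_tensors="pt")
    # The model has a 2k context window, so keep prompt + generation short.
    output = model.generate(**inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

Calling `generate_answer(context, question)` downloads the model weights on first use; the `build_rag_prompt` helper can be tested and adapted independently.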