DILAB-HYU/DialRet
Text Generation · Open Weights
- Model Size: 7B
- Quantization: FP8
- Context Length: 4k
- Concurrency Cost: 1
- Published: Feb 27, 2025
- License: MIT
- Architecture: Transformer
DialRet is a 7-billion-parameter dialogue-specific language model developed by DILAB-HYU, built on the Llama-1 base architecture. It is designed to enhance dialogue retention and understanding in multi-session conversations by combining a long-context language model with instruction tuning across eight diverse dialogue tasks. The model maintains context and improves the quality of multi-session dialogue without relying on explicit memory modules.
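Because the model handles multi-session context in-prompt rather than through a separate memory module, prior sessions can simply be concatenated into the input. Below is a minimal usage sketch, assuming the weights are hosted on the Hugging Face Hub under the `DILAB-HYU/DialRet` ID and load with the standard `transformers` API; the session formatting shown is illustrative, not the model's documented prompt template.

```python
# Minimal sketch: loading DialRet with the standard transformers API.
# Assumptions: weights are on the Hugging Face Hub under this ID, and
# fp16 is sufficient to fit the 7B model on a single GPU.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DILAB-HYU/DialRet"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Multi-session dialogue: earlier sessions are kept in the prompt itself,
# since the model retains context without an explicit memory module.
# This session layout is hypothetical, for illustration only.
history = (
    "Session 1:\n"
    "User: I adopted a beagle named Toby last month.\n"
    "Assistant: Congratulations! How is Toby settling in?\n\n"
    "Session 2:\n"
    "User: Remind me, what breed is my dog?\n"
    "Assistant:"
)

inputs = tokenizer(history, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)

# Decode only the newly generated tokens after the prompt.
print(tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[-1]:],
    skip_special_tokens=True,
))
```

A correct continuation here would recall "beagle" from the first session, which is the retention behavior the model is tuned for.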