AdaptLLM/law-chat
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quantization: FP8 · Context Length: 4k · Published: Dec 9, 2023 · License: llama2 · Architecture: Transformer · Open Weights

AdaptLLM/law-chat is a 7-billion-parameter model based on LLaMA-2-Chat-7B, developed by AdaptLLM and fine-tuned for legal-domain applications. It uses a reading-comprehension-style method for continued pre-training on domain-specific corpora, improving its ability to answer legal questions. Despite its modest size, the model is designed to perform effectively in legal contexts and to compete with much larger domain-specific models.
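Since the model is derived from LLaMA-2-Chat, it expects prompts in the standard Llama-2 chat template. Below is a minimal sketch of building such a prompt; the `build_prompt` helper and the example legal question are illustrative, not part of the official model card.

```python
def build_prompt(user_question: str, system_prompt: str = "") -> str:
    """Wrap a user question in the Llama-2-Chat [INST] template.

    This is a hand-rolled sketch of the standard Llama-2 chat format;
    in practice the tokenizer's chat template should be preferred.
    """
    if system_prompt:
        return (
            f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
            f"{user_question} [/INST]"
        )
    return f"<s>[INST] {user_question} [/INST]"


# Hypothetical legal question for illustration.
prompt = build_prompt(
    "What is the difference between a tort and a breach of contract?"
)
print(prompt)
```

The resulting string can then be fed to the model, for example via `transformers` with `AutoModelForCausalLM.from_pretrained("AdaptLLM/law-chat")` and the matching `AutoTokenizer`; running inference requires downloading the 7B weights and is omitted here.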
