Lili85/Llama2-7BCoQA-full
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Architecture: Transformer · Published: Apr 7, 2026
Lili85/Llama2-7BCoQA-full is a 7 billion parameter Llama-2-based language model, fine-tuned from meta-llama/Llama-2-7b-hf. This model is specifically trained for conversational question answering (CoQA) tasks, leveraging the TRL framework for supervised fine-tuning. It is designed to generate coherent and contextually relevant responses in interactive dialogue settings, making it suitable for applications requiring nuanced conversational understanding.
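As a sketch of how such a conversational QA model might be queried, the snippet below folds a passage and prior question/answer turns into a single prompt. The exact prompt template used during fine-tuning is not documented on this card, so the `build_coqa_prompt` helper is hypothetical; the generation call at the end follows the standard Hugging Face `transformers` pipeline API.

```python
# Sketch: building a CoQA-style prompt for conversational question answering.
# Assumption: the model expects the passage followed by Q:/A: dialogue turns;
# the actual fine-tuning template may differ.

def build_coqa_prompt(story: str, turns: list[tuple[str, str]], question: str) -> str:
    """Fold a passage and prior Q/A turns into a single prompt string."""
    history = "".join(f"Q: {q}\nA: {a}\n" for q, a in turns)
    return f"{story}\n\n{history}Q: {question}\nA:"

prompt = build_coqa_prompt(
    story="The Eiffel Tower is in Paris. It was completed in 1889.",
    turns=[("Where is the Eiffel Tower?", "Paris")],
    question="When was it completed?",
)

# Generation itself (requires the model weights and suitable hardware):
# from transformers import pipeline
# generator = pipeline("text-generation", model="Lili85/Llama2-7BCoQA-full")
# answer = generator(prompt, max_new_tokens=32)[0]["generated_text"]
```

Keeping the full dialogue history in the prompt lets the model resolve pronouns like "it" in follow-up questions against earlier turns, which is the core difficulty CoQA targets.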