4bit/medllama2_7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · License: MIT · Architecture: Transformer · Open Weights · Cold

Medllama2_7b is a 7 billion parameter language model published under the 4bit namespace and fine-tuned for conversational medical question-answering using the Medical Meadow MedQA dataset. With a 4096-token context length, it is suited to generating medically relevant responses and assisting with healthcare-related inquiries.
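A minimal sketch of keeping a medical Q&A prompt within the model's 4096-token context window. The token budget, the `RESERVE` headroom value, and the whitespace-based token estimate are all assumptions for illustration; a real client would count tokens with the model's own tokenizer.

```python
# Sketch: fitting conversation history into medllama2_7b's 4096-token context.
# Token counts are approximated by whitespace splitting (assumption); use the
# model's tokenizer for exact counts in production.

MAX_CTX = 4096   # context length stated on this card
RESERVE = 512    # headroom left for the generated answer (assumed value)

def build_prompt(question: str, history: list[str]) -> str:
    """Assemble a prompt from prior turns plus the new question,
    dropping the oldest turns until the rough token budget fits."""
    budget = MAX_CTX - RESERVE

    def size(parts: list[str]) -> int:
        # crude token estimate: one token per whitespace-separated word
        return sum(len(p.split()) for p in parts) + len(question.split())

    turns = list(history)
    while turns and size(turns) > budget:
        turns.pop(0)  # drop the oldest turn first
    return "\n".join(turns + [f"Question: {question}", "Answer:"])

prompt = build_prompt(
    "What is the first-line treatment for uncomplicated hypertension?", []
)
print(prompt.endswith("Answer:"))  # → True
```

Dropping whole turns from the front keeps the most recent context intact, which matters for multi-turn medical dialogues where the latest question usually depends on the immediately preceding exchange.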
