vikash06/doctorLLM10k
Task: Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Feb 4, 2024 · License: MIT · Architecture: Transformer · Open weights

vikash06/doctorLLM10k is a 7-billion-parameter language model with a 4096-token context length, fine-tuned to generate medical responses. It is designed to answer questions and follow instructions that require general medical knowledge, without relying on external search. Its primary application is in scenarios that demand nuanced, contextually appropriate medical text generation.
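Because the model answers medical instructions directly rather than querying external sources, typical usage reduces to building an instruction prompt and checking that it fits the 4096-token context window. A minimal sketch of that flow is below; the prompt template and helper names are hypothetical (the model card does not specify a template), and the whitespace-based token count is only a rough approximation of what the model's real tokenizer would report.

```python
# Sketch: build an instruction prompt for vikash06/doctorLLM10k and
# guard against its 4k-token context window.
# NOTE: the "### Instruction / ### Response" template is an assumed
# example, not a documented format for this model.

CONTEXT_LENGTH = 4096       # model's context window, in tokens
RESERVED_FOR_OUTPUT = 512   # leave headroom for the generated answer


def build_prompt(instruction: str) -> str:
    """Wrap a medical instruction in a simple prompt template."""
    return f"### Instruction:\n{instruction}\n\n### Response:\n"


def fits_context(prompt: str) -> bool:
    """Rough context check using whitespace token counting.

    A production check should use the model's own tokenizer; splitting
    on whitespace is only a coarse sanity bound.
    """
    approx_tokens = len(prompt.split())
    return approx_tokens + RESERVED_FOR_OUTPUT <= CONTEXT_LENGTH


prompt = build_prompt("What are common symptoms of iron-deficiency anemia?")
print(fits_context(prompt))  # a short clinical question easily fits
```

In practice the same guard would be run with the model's actual tokenizer, since FP8-quantized 7B models are usually deployed behind an inference server where over-length prompts are rejected or silently truncated.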
