Creamory/medical-llama-3.1-8b-en-merged is an 8-billion-parameter language model built on the Llama 3.1 architecture with a 32,768-token context length, intended for medical applications. This model is deprecated; development continues on Creamory/medical-llama-3.1-8b-final and Creamory/medical-llama-3.1-8b-final-gguf.
Model Overview
This model, Creamory/medical-llama-3.1-8b-en-merged, is an 8-billion-parameter language model built on the Llama 3.1 architecture with a 32,768-token context window. It was developed by Creamory with a focus on medical applications.
Current Status
This model is deprecated and no longer receives updates. Users are advised to migrate to its successors, which are actively developed and maintained:
- Creamory/medical-llama-3.1-8b-final: The primary successor model.
- Creamory/medical-llama-3.1-8b-final-gguf: A GGUF-quantized version of the successor, suitable for local inference (for example, with llama.cpp-based runtimes) and other resource-constrained deployment scenarios.
Intended Use
Although now deprecated, this model was originally aimed at tasks in the medical domain, relying on its 8-billion-parameter scale and 32,768-token context window for analysis and generation of longer medical texts. For new projects or ongoing work, use the successor models listed above.
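As a Llama 3.1-based model, its chat input follows the Llama 3.1 prompt layout. Below is a minimal, self-contained sketch of that layout for one system + user turn. This is an illustration only: the function name and example messages are hypothetical, it assumes the successor models retain the stock Llama 3.1 chat template, and in practice the tokenizer's `apply_chat_template` (which reads the template shipped with the model) is the authoritative way to build prompts.

```python
def build_llama31_prompt(system: str, user: str) -> str:
    """Format one system + user turn in the stock Llama 3.1 chat layout.

    Hypothetical helper for illustration; prefer the tokenizer's
    apply_chat_template when working with the actual model.
    """
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        # A trailing assistant header cues the model to generate its reply.
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

# Example (hypothetical messages):
prompt = build_llama31_prompt(
    "You are a careful medical assistant. You are not a substitute "
    "for professional medical advice.",
    "Summarize the common side effects of ibuprofen.",
)
print(prompt)
```

Generation should stop at the `<|eot_id|>` token, which marks the end of the assistant's turn in this format.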