abhirajs005/llama3-cardio-fhir-v1

Task: Text Generation · Model Size: 8B · Quant: FP8 · Context Length: 8k · Published: Apr 22, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

abhirajs005/llama3-cardio-fhir-v1 is an 8-billion-parameter Llama 3 model, fine-tuned by abhirajs005 from unsloth/llama-3-8b-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library for accelerated fine-tuning. The model targets applications that require a Llama 3 architecture with an 8192-token context length; its name suggests a focus on cardiology text and FHIR data, though the README does not detail the fine-tuning domain.


Model Overview

abhirajs005/llama3-cardio-fhir-v1 is an 8-billion-parameter Llama 3 model fine-tuned by abhirajs005. It is based on the unsloth/llama-3-8b-bnb-4bit base model and uses the Unsloth library together with Hugging Face's TRL library for accelerated training.

Key Characteristics

  • Architecture: Llama 3 family.
  • Parameter Count: 8 billion parameters.
  • Base Model: Fine-tuned from unsloth/llama-3-8b-bnb-4bit.
  • Training Efficiency: Leverages Unsloth for 2x faster fine-tuning.
  • Context Length: Supports an 8192-token context window.
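The 8192-token context window must be respected on the client side: prompts longer than the window need truncation before generation. A minimal sketch of tail-truncation, assuming the common convention of keeping the most recent tokens (the helper name and the output reserve are illustrative, not part of the model):

```python
def truncate_to_context(token_ids, max_context=8192, reserve_for_output=512):
    """Trim a token-id list to fit the context window, leaving headroom
    for generated output. Oldest tokens are dropped first."""
    budget = max_context - reserve_for_output
    return token_ids[-budget:] if len(token_ids) > budget else token_ids

# Example: a 10,000-token input is trimmed to 8192 - 512 = 7680 tokens.
ids = list(range(10_000))
trimmed = truncate_to_context(ids)
print(len(trimmed))  # 7680
```

Keeping the tail rather than the head preserves the most recent conversation turns, which is usually what matters for chat-style prompts; adjust the reserve to match your `max_new_tokens` setting.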

Intended Use Cases

This model suits applications that benefit from the Llama 3 architecture at an 8B parameter scale with an efficient training pipeline. Its name suggests specialization in cardiology-related text and FHIR (Fast Healthcare Interoperability Resources) data, but the README does not detail the fine-tuning dataset or domain, so evaluate the model against your own task before deployment. Developers looking for an efficiently trained Llama 3 variant should consider this model.
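For deployment sizing, the FP8 quantization listed above implies roughly one byte per weight. A back-of-the-envelope estimate of weight memory (the helper is illustrative and ignores KV cache and activations, which add to the total):

```python
def est_weight_gib(n_params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GiB; excludes KV cache and activations."""
    return n_params * bytes_per_param / 2**30

# 8B parameters at FP8 (1 byte/param) vs. FP16 (2 bytes/param).
fp8 = est_weight_gib(8e9, 1)
fp16 = est_weight_gib(8e9, 2)
print(f"FP8: {fp8:.1f} GiB, FP16: {fp16:.1f} GiB")  # FP8: 7.5 GiB, FP16: 14.9 GiB
```

At FP8 the weights alone fit comfortably on a single 16 GB GPU, which is the practical benefit of the quantization noted in the header.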