Khyatimirani/pcos-fertility-llama3-8b
Text Generation
Concurrency Cost: 1
Model Size: 3.2B
Quant: BF16
Ctx Length: 32k
Published: Feb 5, 2026
License: apache-2.0
Architecture: Transformer
Open Weights · Warm
Khyatimirani/pcos-fertility-llama3-8b is a 3.2 billion parameter LlamaForCausalLM model, fine-tuned by Khyatimirani from unsloth/llama-3.2-3b-instruct; despite the "8b" in its name, the base is the 3B Llama 3.2 instruct model. It was trained with Unsloth and Hugging Face's TRL library, which Unsloth reports can speed up training by roughly 2x. The model supports a context length of 32768 tokens and targets causal language modeling tasks, likely specialized for PCOS- and fertility-related applications, as its name suggests.
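A minimal usage sketch with the Hugging Face Transformers library is shown below. The model ID comes from this card; the example question and generation parameters are illustrative assumptions, not part of the published model card.

```python
def build_messages(question: str) -> list[dict]:
    """Wrap a user question in the chat-message format expected by
    Llama 3.2 instruct-tuned models (and their fine-tunes)."""
    return [{"role": "user", "content": question}]


MODEL_ID = "Khyatimirani/pcos-fertility-llama3-8b"

if __name__ == "__main__":
    # Imports are kept inside the entry point so the helper above can be
    # used without transformers/torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on this card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    # Hypothetical prompt, chosen only to match the model's apparent domain.
    inputs = tokenizer.apply_chat_template(
        build_messages("What lifestyle changes can help manage PCOS symptoms?"),
        add_generation_prompt=True,
        return_tensors="pt",
    )
    outputs = model.generate(inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

This assumes the fine-tune preserves the base model's chat template; if it was trained on a different prompt format, the chat-template call would need to be replaced accordingly.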