eekay/Llama-3.1-8B-Instruct-dog-numbers-ft
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 7, 2026 · Architecture: Transformer · Cold

The eekay/Llama-3.1-8B-Instruct-dog-numbers-ft model is an 8-billion-parameter instruction-tuned language model based on the Llama 3.1 architecture. Published by eekay, it supports a context length of 32,768 tokens. The "dog-numbers" fine-tune name suggests a specialized application, likely involving numerical data related to dogs or a similar domain-specific task, setting it apart from general-purpose instruction models.
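A card like this usually pairs with a short usage sketch. The following is a minimal, hedged example of loading the model with Hugging Face Transformers; only the repository id and context length come from this card, and the prompt text is purely illustrative.

```python
MODEL_ID = "eekay/Llama-3.1-8B-Instruct-dog-numbers-ft"  # repository id from this card
CTX_LEN = 32768  # context length stated on the card (32k tokens)


def chat_messages(user_text: str) -> list[dict]:
    """Build a chat-template message list for the instruct model."""
    return [{"role": "user", "content": user_text}]


if __name__ == "__main__":
    # Heavy dependencies are imported only when run directly, since
    # downloading the 8B checkpoint is a large operation.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )

    # Example prompt is illustrative, not taken from the card.
    inputs = tokenizer.apply_chat_template(
        chat_messages("How many legs do three dogs have?"),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)

    out = model.generate(inputs, max_new_tokens=64)
    print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The model load sits behind the `__main__` guard so the module can be imported (for example, to reuse `chat_messages`) without triggering the checkpoint download.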