eekay/Llama-3.1-8B-Instruct-owl-numbers-ft
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 7, 2026 · Architecture: Transformer · Cold

eekay/Llama-3.1-8B-Instruct-owl-numbers-ft is an 8-billion-parameter instruction-tuned language model, likely based on the Llama 3.1 architecture, with a 32,768-token context length. The "owl-numbers-ft" suffix indicates task-specific fine-tuning, suggesting an optimization for numerical reasoning or data processing. Its primary use case is likely in applications that require precise handling and understanding of numerical information in a conversational context.