marcelbinz/Llama-3.1-Minitaur-8B
Text generation

- Model size: 8B
- Quantization: FP8
- Context length: 32k
- Published: Dec 18, 2024
- License: llama3.1
- Architecture: Transformer

marcelbinz/Llama-3.1-Minitaur-8B is an 8-billion-parameter language model and a smaller variant of Llama-3.1-Centaur-70B. Built on the Llama 3.1 architecture, it is intended for general language tasks, and its 32,768-token context length offers a balance of capability and efficiency across a range of applications.
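Since the 32,768-token window is a hard input budget, callers typically check prompt length before sending a request. The sketch below illustrates that budgeting with a crude characters-per-token estimate; the heuristic and the helper names are assumptions for illustration, not the real Llama 3.1 tokenizer.

```python
# Sketch: keep a prompt within the model's 32,768-token context window.
# CHARS_PER_TOKEN is a rough stand-in for the real Llama 3.1 tokenizer
# (an assumption; actual token counts will differ).

CONTEXT_LENGTH = 32768   # the model's advertised context window
CHARS_PER_TOKEN = 4      # crude heuristic, not the real tokenizer

def estimate_tokens(text: str) -> int:
    """Rough token count for budgeting purposes."""
    return max(1, len(text) // CHARS_PER_TOKEN)

def fits_in_context(prompt: str, reserved_for_output: int = 512) -> bool:
    """True if the prompt plus a reserved generation budget fits the window."""
    return estimate_tokens(prompt) + reserved_for_output <= CONTEXT_LENGTH

print(fits_in_context("Summarize the following report."))  # short prompt fits
```

In practice one would count tokens with the model's actual tokenizer rather than a character heuristic, but the budgeting logic stays the same.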
