mlabonne/TwinLlama-3.1-8B
Text Generation · Open Weights
- Concurrency Cost: 1
- Model Size: 8B
- Quantization: FP8
- Context Length: 32k
- Published: Jul 31, 2024
- License: apache-2.0
- Architecture: Transformer
TwinLlama-3.1-8B is an 8-billion-parameter Llama 3.1-based model published by mlabonne, trained to function as a digital twin: it imitates the writing style and knowledge base of its creators (mlabonne, Paul Iusztin, and Alex Vesa), drawing on their articles as training data. The model is optimized for generating text in a specific authorial voice, making it suitable for personalized content creation and stylistic replication tasks.
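A minimal inference sketch using the Hugging Face transformers library. It assumes the weights are pulled from the "mlabonne/TwinLlama-3.1-8B" repository and loaded locally in bfloat16 (rather than the FP8 serving format listed above); the prompt and sampling parameters are illustrative, not prescribed by the model card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mlabonne/TwinLlama-3.1-8B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # ~16 GB for an 8B model; the hosted variant is served in FP8
    device_map="auto",           # spread layers across available GPU(s)/CPU automatically
)

# Illustrative prompt; the model continues it in the authors' writing style.
prompt = "Write a short paragraph about fine-tuning large language models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```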