Kukedlc/NeuralLLaMa-3-8b-ORPO-v0.3
Text Generation · Open Weights
Concurrency Cost: 1
Model Size: 8B
Quantization: FP8
Context Length: 8k
Published: May 14, 2024
License: apache-2.0
Architecture: Transformer

Kukedlc/NeuralLLaMa-3-8b-ORPO-v0.3 is an 8-billion-parameter language model based on the LLaMa-3 architecture, fine-tuned with ORPO (Odds Ratio Preference Optimization). It supports an 8192-token context length and performs well across a range of benchmarks, including reasoning and common-sense tasks. The model is particularly noted for fluent, clear, and precise Spanish, making it well suited to advanced conversational AI applications in that language.
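For conversational use, prompts are typically assembled in the base model's instruct chat format. The sketch below builds a single-turn prompt string in the standard LLaMa-3 instruct template; whether this ORPO fine-tune keeps that exact template is an assumption (verify against the model's tokenizer configuration before relying on it).

```python
# Sketch: assembling a single-turn prompt in the LLaMa-3 instruct format.
# Assumption: this fine-tune keeps the base model's chat template; the
# special tokens below are the standard LLaMa-3 instruct markers.

def build_llama3_prompt(system: str, user: str) -> str:
    """Return a LLaMa-3 instruct prompt for one system + user turn."""
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        # Ending on the assistant header cues the model to generate its reply.
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_llama3_prompt(
    "Eres un asistente útil que responde en español.",
    "¿Qué es el aprendizaje por refuerzo?",
)
```

In practice, libraries such as Hugging Face `transformers` apply this template automatically via the tokenizer's chat-templating support, so hand-building the string is mainly useful for debugging or for serving stacks that take raw prompts.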
