Kukedlc/NeuralLLaMa-3-8b-ORPO-v0.4
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 8K · Published: May 16, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
Kukedlc/NeuralLLaMa-3-8b-ORPO-v0.4 is an 8-billion-parameter language model with an 8,192-token context length. It is a fine-tuned variant, likely based on the Llama 3 architecture, trained with ORPO (Odds Ratio Preference Optimization), a method that folds preference alignment into supervised fine-tuning without a separate reference model. The model card does not describe specific differentiators or primary use cases, so it is best treated as a general-purpose language model until further information clarifies any specialized strengths or applications.
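For reference, a minimal loading and generation sketch using the standard Hugging Face transformers API is shown below. The model card itself does not include usage code, so the dtype choice and the assumption that the fine-tune keeps the base model's chat template are not confirmed by the source.

```python
# Minimal sketch, assuming the standard Hugging Face `transformers` API;
# not taken from the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Kukedlc/NeuralLLaMa-3-8b-ORPO-v0.4"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 fits the available hardware
    device_map="auto",
)

# Assumption: the fine-tune reuses the base model's chat template.
messages = [{"role": "user", "content": "Explain ORPO in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```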