layai/syn-youtube-vanilla

Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 8k · Published: Apr 10, 2026 · Architecture: Transformer

The layai/syn-youtube-vanilla model is an 8 billion parameter causal language model fine-tuned from Meta's Llama-3-8B base model. It was trained with a context length of 8192 tokens, achieving a loss of 3.3634 and an accuracy of 0.5275 on its evaluation set. It is positioned as a general-purpose language model; the developer has not yet documented its specific differentiators or intended uses.


Model Overview

layai/syn-youtube-vanilla is an 8 billion parameter language model fine-tuned from Meta-Llama-3-8B. It was trained for 3 epochs with a learning rate of 5e-05, a total batch size of 160, and the Adam optimizer, reaching a loss of 3.3634 and an accuracy of 0.5275 on its evaluation set.
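
The card does not include a usage snippet, so the following is a minimal loading-and-generation sketch using Hugging Face transformers. It assumes the checkpoint is published on the Hub under the model's id and loads with the standard Llama-3 classes; the dtype and device settings are illustrative, not confirmed by the developer.

```python
# Minimal inference sketch; assumes the checkpoint is hosted on the
# Hugging Face Hub under this id and uses standard Llama-3 classes.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "layai/syn-youtube-vanilla"  # assumed Hub id; adjust if hosted elsewhere

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the card lists FP8; bf16 is a safe fallback
    device_map="auto",
)

inputs = tokenizer("The quick brown fox", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```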

Key Training Details

  • Base Model: Meta-Llama-3-8B
  • Parameters: 8 Billion
  • Context Length: 8192 tokens
  • Learning Rate: 5e-05
  • Optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08 (mirrored in the configuration sketch after this list)
  • Epochs: 3.0
  • Evaluation Metrics: Loss: 3.3634, Accuracy: 0.5275
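
These hyperparameters map directly onto Hugging Face TrainingArguments. The sketch below is a hypothetical reconstruction, not the developer's actual training script: the fine-tuning dataset is undocumented, and the split of the total batch size of 160 into per-device size, device count, and gradient accumulation is a guess.

```python
# Hypothetical reconstruction of the reported hyperparameters using
# Hugging Face TrainingArguments. The batching layout is an assumption
# that merely multiplies out to the stated total batch size of 160.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="syn-youtube-vanilla-finetune",
    learning_rate=5e-5,
    num_train_epochs=3.0,
    per_device_train_batch_size=4,   # assumption: 4 x 8 GPUs x 5 accumulation = 160
    gradient_accumulation_steps=5,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```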

Current Status

The developer has not yet provided details on the fine-tuning dataset, intended uses, limitations, or comprehensive training and evaluation data. Until that information is available, developers should exercise caution and run their own evaluations before relying on the model for specific applications.
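
As one starting point for such an evaluation, the sketch below computes average held-out cross-entropy loss, a figure directly comparable to the reported 3.3634. The texts are placeholders; the model's actual evaluation set is not documented, and the model id is assumed as above.

```python
# Minimal held-out evaluation sketch: average cross-entropy loss over
# caller-supplied texts. Replace the placeholder texts with your own data.
import math

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "layai/syn-youtube-vanilla"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model.eval()

texts = ["Replace this with samples from your own domain."]  # placeholder data

losses = []
with torch.no_grad():
    for text in texts:
        enc = tokenizer(text, return_tensors="pt", truncation=True, max_length=8192).to(model.device)
        out = model(**enc, labels=enc["input_ids"])  # HF shifts labels internally
        losses.append(out.loss.item())

mean_loss = sum(losses) / len(losses)
print(f"mean loss: {mean_loss:.4f}  perplexity: {math.exp(mean_loss):.2f}")
```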