mvswaroop/finetuned_llama3.2_grok_data
Text Generation · Concurrency Cost: 1 · Model Size: 3.2B · Quant: BF16 · Ctx Length: 32k · Published: Mar 5, 2025 · Architecture: Transformer · Status: Warm

mvswaroop/finetuned_llama3.2_grok_data is a 3.2B-parameter language model with a 32,768-token context length, published by mvswaroop as a fine-tuned variant of the Llama 3.2 architecture. Its fine-tuning objective and primary use case are not documented, so it is best treated as a general-purpose text-generation model until its specialization is specified.
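
Below is a minimal sketch of loading and querying the model, assuming it is hosted on the Hugging Face Hub under the repo id shown above and follows the standard transformers causal-LM interface used by Llama 3.2 checkpoints; the prompt and generation settings are illustrative only.

```python
# Minimal usage sketch. Assumes the checkpoint is available on the
# Hugging Face Hub and is compatible with the standard Llama 3.2
# causal-LM interface in transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "mvswaroop/finetuned_llama3.2_grok_data"  # repo id from this card

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 quant listed above
    device_map="auto",           # place weights on available GPU/CPU
)

prompt = "Explain what fine-tuning a language model means."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```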
