PharynxAI/finetuned_Maghalaya_tripura_19-24_merged
Task: Text Generation · Model Size: 8B · Quantization: FP8 · Context Length: 32k · License: apache-2.0 · Architecture: Transformer · Open Weights · Concurrency Cost: 1

PharynxAI/finetuned_Maghalaya_tripura_19-24_merged is an 8-billion-parameter Llama 3.1 instruction-tuned language model developed by PharynxAI. The model was fine-tuned with Unsloth and Hugging Face's TRL library, which the authors report enabled 2x faster training. It is intended for general language understanding and generation tasks, building on its Llama 3.1 base architecture and 32768-token context length.
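A minimal usage sketch, assuming the model is published on the Hugging Face Hub under the ID above and loads through the standard `transformers` text-generation pipeline (the prompt and generation parameters are illustrative, not from the card):

```python
# Hypothetical usage sketch; MODEL_ID is taken from the card, everything else
# (prompt, max_new_tokens) is an illustrative assumption.
MODEL_ID = "PharynxAI/finetuned_Maghalaya_tripura_19-24_merged"
MAX_CONTEXT = 32768  # context length stated on the card

def build_generator():
    # Import kept inside the function so the sketch can be read without
    # transformers installed; loading an 8B FP8 model requires a suitable GPU.
    from transformers import pipeline
    return pipeline("text-generation", model=MODEL_ID)

if __name__ == "__main__":
    generator = build_generator()
    out = generator("Write one sentence about Meghalaya and Tripura.",
                    max_new_tokens=64)
    print(out[0]["generated_text"])
```

The `__main__` guard keeps the module importable without triggering the multi-gigabyte model download.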