ermiaazarkhalili/Llama-3.1-8B-Instruct_Function_Calling_xLAM
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Jul 31, 2025 · License: llama3.1 · Architecture: Transformer

ermiaazarkhalili/Llama-3.1-8B-Instruct_Function_Calling_xLAM is an 8-billion-parameter language model by ermiaazarkhalili, fine-tuned from Meta's Llama-3.1-8B-Instruct. It is optimized for function calling: it was trained with Supervised Fine-Tuning (SFT) using LoRA adapters on the Salesforce/xlam-function-calling-60k dataset. The model card lists a 2,048-token context length, and the model is primarily intended for research, education, and prototyping conversational AI with function-calling capabilities.
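In practice, a function-calling model like this one is given a natural-language query plus a JSON description of available tools, and is expected to reply with a JSON list of calls. The sketch below shows one plausible prompt-assembly and response-parsing flow; the exact prompt template, tool schema, and output format here are assumptions modeled on the xLAM dataset convention (check the model's chat template before relying on them), and `get_weather` is a hypothetical tool.

```python
import json

def build_prompt(query, tools):
    """Assemble a function-calling prompt in an xLAM-style format.

    The template below is an illustrative assumption, not the model's
    verified chat template.
    """
    tool_block = json.dumps(tools, indent=2)
    return (
        "You have access to the following tools:\n"
        f"{tool_block}\n\n"
        "Respond with a JSON list of calls, e.g. "
        '[{"name": "...", "arguments": {...}}].\n\n'
        f"Query: {query}"
    )

def parse_calls(model_output):
    """Parse the model's JSON reply into (name, arguments) pairs."""
    calls = json.loads(model_output)
    return [(c["name"], c["arguments"]) for c in calls]

# Hypothetical tool definition for illustration.
tools = [{
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "parameters": {"city": {"type": "string"}},
}]

prompt = build_prompt("What's the weather in Paris?", tools)

# A well-behaved fine-tune would emit something like this string,
# which parse_calls turns into structured calls:
reply = '[{"name": "get_weather", "arguments": {"city": "Paris"}}]'
print(parse_calls(reply))  # [('get_weather', {'city': 'Paris'})]
```

The returned `(name, arguments)` pairs can then be dispatched to real tool implementations, with the tool results fed back to the model for a final answer.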
