Sakshi1307/llama-2-7b-Sakshi
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · License: llama2 · Architecture: Transformer · Open Weights · Cold

Sakshi1307/llama-2-7b-Sakshi is a Llama 2-based 7B language model fine-tuned on the FindSUM dataset for summarization. Its primary application is generating concise summaries from input text, making it suitable for information extraction and content condensation.
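A minimal usage sketch with the Hugging Face `transformers` library is below. It assumes the checkpoint is a standard Llama 2 causal-LM; the instruction-style prompt format is a guess, not documented by the model author, so adjust it to match how the model was actually fine-tuned.

```python
def build_prompt(document: str) -> str:
    """Wrap an input document in a simple summarization instruction.

    Hypothetical prompt template -- the model card does not specify one.
    """
    return f"Summarize the following text:\n\n{document}\n\nSummary:"


def summarize(document: str, model_id: str = "Sakshi1307/llama-2-7b-Sakshi",
              max_new_tokens: int = 128) -> str:
    """Generate a summary via a text-generation pipeline (sketch only)."""
    # Imported lazily so the module loads without transformers installed.
    from transformers import pipeline

    generator = pipeline("text-generation", model=model_id)
    out = generator(build_prompt(document), max_new_tokens=max_new_tokens,
                    return_full_text=False)
    return out[0]["generated_text"].strip()
```

Running the 7B model locally requires enough GPU or CPU memory for the weights; the FP8 quantization shown in the listing applies to the hosted deployment, not necessarily to the checkpoint itself.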
