ayah-kamal/llama-2-7b-elsevier-shapeless-sentences
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Architecture: Transformer
The ayah-kamal/llama-2-7b-elsevier-shapeless-sentences model is a Llama-2-7b-based language model published by ayah-kamal. It was trained with 4-bit quantization via the bitsandbytes library, using nf4 quantization and a float16 compute dtype. The model name suggests it was fine-tuned to process or generate text in a 'shapeless sentence' format, possibly derived from Elsevier content, but its primary use case is not explicitly documented.