Shishir1807/M1_llama
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Architecture: Transformer

Shishir1807/M1_llama is a 7-billion-parameter causal language model fine-tuned from the Meta Llama-2-7b-hf base model using H2O LLM Studio. Built on the Llama 2 architecture, it targets general text generation, including conversational and instruction-following use cases. It is intended for deployment on GPU-equipped machines and supports quantization for efficient inference.
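A minimal usage sketch with the Hugging Face transformers library, assuming the `Shishir1807/M1_llama` checkpoint is published on the Hub with the standard Llama-2 layout (the function name `generate_text` and the prompt are illustrative, not part of the model card):

```python
# Sketch: loading Shishir1807/M1_llama for text generation.
# Assumes transformers, torch, and a GPU are available.
MODEL_ID = "Shishir1807/M1_llama"

def generate_text(prompt: str, max_new_tokens: int = 128) -> str:
    # Imports are kept inside the function so the sketch can be read
    # without transformers/torch installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.float16,  # half-precision weights for GPU inference
        device_map="auto",          # place layers on the available GPU(s)
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate_text("Explain what a causal language model is."))
```

For lower memory use, the weights can instead be loaded with a quantization backend (for example 8-bit via bitsandbytes), at some cost in latency or accuracy.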
