Evan768/testEvan
Task: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quantization: FP8
Context Length: 4k
Published: Dec 26, 2024
License: llama2
Architecture: Transformer
Open Weights: Yes
Evan768/testEvan is a 7-billion-parameter causal language model fine-tuned from Meta's Llama-2-7b-chat-hf. It was trained with a learning rate of 2e-05 for 3 epochs. The card does not describe specific differentiators or intended uses; the model serves as a base for further fine-tuning or for exploring Llama-2 derivatives.
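Since the card lists only standard Llama-2 fine-tuning hyperparameters, loading the checkpoint presumably follows the usual Hugging Face `transformers` pattern. A minimal sketch, assuming the `Evan768/testEvan` repository is publicly downloadable; the prompt, generation settings, and device placement are illustrative, not taken from the card:

```python
# Training hyperparameters as reported on the model card.
REPORTED_HPARAMS = {
    "learning_rate": 2e-05,
    "num_train_epochs": 3,
}


def load_and_generate(prompt: str, model_id: str = "Evan768/testEvan") -> str:
    """Load the fine-tuned checkpoint and generate a completion.

    Note: this downloads roughly 7B parameters of weights on first use,
    so a GPU and substantial disk space are assumed.
    """
    # Imported inside the function so the hyperparameter summary above
    # can be used without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Because the model derives from the chat-tuned Llama-2 variant, prompts formatted with the Llama-2 chat template will likely behave better than raw text.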