abeiler/AlphaRep
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Architecture: Transformer

The abeiler/AlphaRep model is a fine-tuned version of Meta's Llama-2-7b-hf, a 7-billion-parameter causal language model. It was trained for 1 epoch with the Adam optimizer at a learning rate of 1e-4. Further details on its specific capabilities, training dataset, and intended uses are not provided in the available documentation.
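To make the reported hyperparameters concrete, here is a minimal pure-Python sketch of a single Adam update using the stated learning rate of 1e-4. The beta and epsilon values are Adam's common defaults, assumed here since the documentation does not specify them; this illustrates the optimizer's update rule, not the actual training code.

```python
import math

def adam_step(param, grad, m, v, t,
              lr=1e-4, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single scalar parameter.

    lr=1e-4 mirrors the learning rate reported for this fine-tune;
    beta1/beta2/eps are assumed defaults, not documented values.
    """
    # Update biased first and second moment estimates.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    # Bias-correct the moments (t is the 1-based step count).
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Apply the parameter update.
    param = param - lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v

# Toy usage: three steps against a constant gradient of 1.0.
p, m, v = 0.5, 0.0, 0.0
for t in range(1, 4):
    p, m, v = adam_step(p, 1.0, m, v, t)
```

With a constant gradient, the bias-corrected moments cancel and each step moves the parameter by roughly the learning rate, so after three steps `p` is about 0.4997.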
