abeiler/NumAndAlphaInstruct-75-25-500K
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Architecture: Transformer
abeiler/NumAndAlphaInstruct-75-25-500K is a 7-billion-parameter instruction-tuned model, fine-tuned from meta-llama/Llama-2-7b-hf. It was trained for one epoch with the Adam optimizer at a learning rate of 1e-4 (0.0001). Its specific differentiators and primary use cases are not detailed in the provided information.
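The stated fine-tuning setup can be summarized as a plain configuration dict. This is only an illustrative sketch: the key names (`base_model`, `learning_rate`, `num_epochs`, `optimizer`) are hypothetical and not from the card, which does not document the training framework used.

```python
# Hedged sketch of the fine-tuning hyperparameters stated in the card.
# Key names are illustrative; the actual training configuration is not published.
finetune_config = {
    "base_model": "meta-llama/Llama-2-7b-hf",  # stated base model
    "learning_rate": 1e-4,                      # stated as 0.0001
    "num_epochs": 1,                            # stated as 1 epoch
    "optimizer": "adam",                        # stated optimizer
}
```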