RefalMachine/llm_test_raw
Text generation · 7B parameters · FP8 quantization · 4k context length · Transformer architecture · Concurrency cost: 1

RefalMachine/llm_test_raw is a 7-billion-parameter language model fine-tuned from TheBloke/Llama-2-7B-fp16. It was trained for one epoch, reaching a validation loss of 2.0955 and an accuracy of 0.5405 on its evaluation set. The model retains the Llama-2 architecture of its base; the metrics reported here come from its fine-tuning evaluation.
