EEG123/subject1-test1
Text generation | Model size: 1B | Quantization: BF16 | Context length: 32k | License: llama3.2 | Architecture: Transformer

EEG123/subject1-test1 is a 1-billion-parameter language model fine-tuned from meta-llama/Llama-3.2-3B. It was adapted on the EEG123/DE_subject_1 dataset and reaches a loss of 1.5612 on its evaluation set. The model is intended for tasks related to its specialized training data, where it offers focused performance, and supports a context length of 32768 tokens.
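As a quick usage sketch, the model can be loaded with the Hugging Face transformers library, assuming the checkpoint is published on the Hub under the repo id EEG123/subject1-test1:

```python
# Minimal usage sketch for a causal LM checkpoint.
# Assumption: the model is hosted on the Hugging Face Hub as "EEG123/subject1-test1".
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EEG123/subject1-test1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 quantization listed above
    device_map="auto",           # place weights on the available device(s)
)

prompt = "Hello, world."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```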
