KnutJaegersberg/Deacon-1b
- Task: text generation
- Model size: 1.1B parameters
- Quantization: BF16
- Context length: 2k
- Published: Dec 3, 2023
- License: cc-by-nc-4.0
- Architecture: Transformer

KnutJaegersberg/Deacon-1b is a 1.1-billion-parameter causal language model fine-tuned from appvoid/palmer-001. It was fine-tuned for 3 epochs using NEFTune (noisy embedding fine-tuning) and is intended for general language understanding tasks. It achieves an average score of 35.21 on the Open LLM Leaderboard, with notable performance on the HellaSwag and Winogrande benchmarks.
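For context on the training recipe, NEFTune works by adding scaled uniform noise to the token embeddings during fine-tuning. The sketch below illustrates that noise injection in NumPy; the noise strength `alpha=5.0` is an assumed, commonly used value, since the model card does not state which setting was used for Deacon-1b.

```python
import numpy as np

def neftune_noise(embeddings: np.ndarray, alpha: float = 5.0) -> np.ndarray:
    """Add NEFTune-style uniform noise to a (seq_len, hidden_dim) embedding matrix.

    The noise is drawn from Uniform(-1, 1) and scaled by
    alpha / sqrt(seq_len * hidden_dim), so longer sequences and wider
    models receive proportionally smaller per-element perturbations.
    alpha=5.0 is an assumed default, not a value from the model card.
    """
    seq_len, hidden_dim = embeddings.shape
    scale = alpha / np.sqrt(seq_len * hidden_dim)
    noise = np.random.uniform(-1.0, 1.0, size=embeddings.shape) * scale
    return embeddings + noise

# Toy example: a sequence of 4 tokens with 8-dimensional embeddings.
emb = np.zeros((4, 8))
noisy = neftune_noise(emb, alpha=5.0)
print(noisy.shape)  # (4, 8)
```

During actual training the noise is applied only in the forward pass of each fine-tuning step, never at inference time, which is why it changes the learned weights without altering how the published model is run.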
