KnutJaegersberg/Deacon-1b is a 1.1-billion-parameter causal language model fine-tuned from appvoid/palmer-001. It was trained for 3 epochs using NEFTune (noisy embedding fine-tuning) and targets general language understanding. On the Open LLM Leaderboard it achieves an average score of 35.21, with its strongest results on the HellaSwag and Winogrande benchmarks.