Danielbrdz/Barcenas-13b
Danielbrdz/Barcenas-13b is a 13-billion-parameter language model based on the Llama 2 architecture, developed by Danielbrdz. It was fine-tuned on the garage-bAInd/Open-Platypus dataset, making it suitable for general-purpose text generation and understanding tasks. The model offers a 4096-token context length, providing a solid foundation for a range of natural language processing applications.
Barcenas-13b Overview
Danielbrdz/Barcenas-13b is a 13-billion-parameter large language model built upon the Llama 2 13b architecture. It was developed by Danielbrdz and trained on an Nvidia Tesla A100 GPU.
Key Training Details
- Base Model: Llama 2 13b
- Training Dataset: garage-bAInd/Open-Platypus
- Hardware: Nvidia Tesla A100
Intended Use Cases
Given its Llama 2 foundation and fine-tuning on Open-Platypus, an instruction-following dataset, Barcenas-13b is well-suited to a variety of general natural language processing tasks. It can be applied to text generation, summarization, question answering, and conversational AI, particularly where a 13-billion-parameter model offers a reasonable balance between output quality and computational cost.
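Models fine-tuned on Open-Platypus typically expect prompts wrapped in an Alpaca-style instruction template. The sketch below shows one way to build such a prompt; note that the exact template and the `format_prompt` helper are assumptions for illustration, so verify the format against the model card before relying on it.

```python
# Sketch: wrapping a user request in an Alpaca-style instruction template,
# as commonly used by Open-Platypus fine-tunes. The template shown here is
# an assumption -- confirm the actual format on the model card.

def format_prompt(instruction: str, model_input: str = "") -> str:
    """Build an Alpaca-style prompt, with or without extra input context."""
    if model_input:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{model_input}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

prompt = format_prompt("Summarize the Llama 2 paper in two sentences.")
# The resulting string would then be tokenized and passed to the model
# (e.g. via transformers' AutoModelForCausalLM.generate), keeping the
# total token count within the 4096-token context window.
```

The same helper can be reused for question answering or summarization by changing the instruction text, or by supplying the document to summarize as the optional input argument.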