TareksGraveyard/Thespian-LLaMa-70B is a 70-billion-parameter language model created by TareksGraveyard, built on the Llama architecture. It is a merge of several Llama-based models, including SicariusSicariiStuff/Negative_LLAMA_70B and Sao10K/L3.3-70B-Euryale-v2.3, combined with the della merge method using nbeerbower/Llama-3.1-Nemotron-lorablated-70B as the base model. The merge aims to combine the strengths of its constituent models, offering broad general-purpose language generation with a 32,768-token context length.
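Della merges like this are typically produced with mergekit. A minimal sketch of what such a configuration might look like, assuming mergekit's YAML format; the weight, density, and dtype values below are illustrative assumptions, not the author's actual settings:

```yaml
# Hypothetical mergekit config sketch; parameter values are placeholders.
merge_method: della
base_model: nbeerbower/Llama-3.1-Nemotron-lorablated-70B
models:
  - model: SicariusSicariiStuff/Negative_LLAMA_70B
    parameters:
      weight: 0.5      # illustrative contribution weight
      density: 0.5     # illustrative fraction of deltas retained
  - model: Sao10K/L3.3-70B-Euryale-v2.3
    parameters:
      weight: 0.5
      density: 0.5
dtype: bfloat16
```

In a della merge, each contributing model's parameter deltas relative to the base are pruned (governed by `density`) and rescaled before being combined, which is intended to reduce interference between the merged models.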