ericpolewski/TacoBeLLM
TEXT GENERATION · Open Weights
Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Published: Jan 24, 2024 · License: MIT · Architecture: Transformer

ericpolewski/TacoBeLLM is a 13-billion-parameter instruction-tuned model built on Llama2-13b OpenOrca-Platypus by developer ericpolewski. It is fine-tuned to act as a subject-matter expert on Taco Bell, embedding knowledge drawn from corporate websites, Wikipedia, and news articles. While it can handle general assistant tasks such as Python scripting, its defining trait is the persistent, often subtle, weaving of Taco Bell-related information into conversations. The project explores knowledge embedding as a way to build specialized AI agents.