TigerResearch/tigerbot-70b-base-v2
Task: Text generation
Concurrency cost: 4
Model size: 70B
Quantization: FP8
Context length: 8k
Published: Nov 17, 2023
License: apache-2.0
Architecture: Transformer (open weights)
TigerResearch/tigerbot-70b-base-v2 is a 70-billion-parameter foundational large language model developed by TigerResearch. As a base model, it serves as a robust starting point for building custom LLMs, offering strong general-purpose language understanding. With a context length of 8192 tokens, it is a versatile foundation for fine-tuning and a range of downstream applications.
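A minimal usage sketch, assuming the weights are hosted under this repo id on the Hugging Face Hub and that you have enough GPU memory to load a 70B model (neither is verified here); the prompt and generation settings are illustrative:

```python
MODEL_ID = "TigerResearch/tigerbot-70b-base-v2"
MAX_CONTEXT = 8192  # context length stated on this card


def load_model():
    """Download the model and shard it across available devices."""
    # Imported lazily so this sketch stays importable without transformers.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",  # keep the checkpoint's stored precision
        device_map="auto",   # shard across GPUs; 70B will not fit on one card
    )
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_model()
    prompt = "The capital of France is"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Because this is a base (not instruction-tuned) model, it is best used for raw text completion or as the starting checkpoint for your own fine-tuning run, rather than for chat-style prompting.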