hwanhe/Big_Minirecord02
Task: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quantization: FP8
Context Length: 8k
License: apache-2.0
Architecture: Transformer
Weights: Open
Status: Cold

hwanhe/Big_Minirecord02 is a 7-billion-parameter language model with an 8192-token (8k) context length. It is a base model: it has not been instruction-tuned, so it is best suited for further fine-tuning or for applications that only need a foundational understanding of language. Its design provides a solid linguistic starting point for a range of downstream tasks.