yuiseki/tinyllama-coder-math-ja-wikipedia-v0.1
Text generation
- Concurrency cost: 1
- Model size: 1.1B
- Quantization: BF16
- Context length: 2k
- Published: Mar 29, 2024
- Architecture: Transformer
yuiseki/tinyllama-coder-math-ja-wikipedia-v0.1 is a 1.1-billion-parameter language model with a 2048-token context length. Built on the TinyLlama architecture, it is fine-tuned for tasks involving coding, mathematics, and Japanese Wikipedia content, and its strength lies in processing and generating text in these specialized domains, making it suitable for focused applications.
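A minimal sketch of running the model with the Hugging Face transformers library, assuming it is published under the same repo ID on the Hugging Face Hub; the prompt and generation settings below are illustrative, not a documented usage pattern:

```python
# Minimal sketch: load and query the model with transformers.
# Assumes the repo ID below is available on the Hugging Face Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yuiseki/tinyllama-coder-math-ja-wikipedia-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 quantization listed above
    device_map="auto",
)

# Illustrative prompt; input plus output must fit the 2048-token context.
prompt = "Write a Python function that returns the n-th Fibonacci number."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```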