ichanchiu/Llama-3.1-Omni-FinAI-8B
Text Generation · Model Size: 8B · Quant: FP8 · Context Length: 32k · Architecture: Transformer · Concurrency Cost: 1 · Published: Nov 13, 2024

Llama-3.1-Omni-FinAI-8B is an 8-billion-parameter large language model developed by ichanchiu, based on the Llama 3.1 architecture. It was pre-trained on 143 billion tokens of high-quality financial text, including SEC filings, Reuters news, financial papers, and financial discussions. The model is intended primarily as a foundational base for finance-specific fine-tuning, targeting tasks such as sentiment analysis, stock movement prediction, and financial summarization.
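As a base model, it can be loaded for completion-style generation with the Hugging Face `transformers` library. The sketch below is illustrative, not an official usage snippet from the model card: it assumes the standard `AutoTokenizer`/`AutoModelForCausalLM` API works with this checkpoint, and the prompt and generation settings are arbitrary examples.

```python
# Hypothetical loading sketch for ichanchiu/Llama-3.1-Omni-FinAI-8B.
# Assumes the checkpoint is compatible with the standard transformers Auto* API.

MODEL_ID = "ichanchiu/Llama-3.1-Omni-FinAI-8B"


def load_model(model_id: str = MODEL_ID):
    """Load tokenizer and model; imports are deferred so the module
    can be inspected without transformers/torch installed."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # pick up the checkpoint's native precision
        device_map="auto",    # spread weights across available devices
    )
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_model()
    # Base (non-instruct) model: use a completion-style financial prompt.
    prompt = "The company's quarterly earnings beat expectations, and the stock"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=40)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Because this is a pre-trained base rather than an instruction-tuned model, downstream use for sentiment analysis or stock movement prediction would typically involve further fine-tuning rather than direct prompting.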
