TheFinAI/Fin-o1-8B
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: May 15, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights
Fin-o1-8B is an 8-billion-parameter language model developed by TheFinAI, fine-tuned from Qwen3-8B for financial reasoning. It was trained with supervised fine-tuning (SFT) and GRPO on a financial dataset built from FinQA, TATQA, and other benchmarks, and is tuned specifically for financial mathematical reasoning tasks.
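A minimal usage sketch with Hugging Face transformers, assuming the model follows the standard Qwen chat template. The repo id comes from this page; the helper names, question text, and generation settings are illustrative, not from the model card.

```python
def build_messages(question: str) -> list[dict]:
    """Wrap a financial question in the chat format expected by the tokenizer's template."""
    return [{"role": "user", "content": question}]


def generate_answer(question: str, max_new_tokens: int = 256) -> str:
    """Generate a completion from Fin-o1-8B (downloads ~8B FP8/BF16 weights on first call)."""
    # Imports kept local so build_messages stays importable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "TheFinAI/Fin-o1-8B"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer.apply_chat_template(
        build_messages(question), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)


# Example (requires a GPU with enough memory for the 8B model):
# print(generate_answer("Revenue rose from $80M to $100M. What is the growth rate?"))
```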
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
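The sampler values above did not render on this page, but the parameter names map directly onto fields of an OpenAI-style chat completions request. A minimal sketch of building such a request payload; the sampler values below are illustrative placeholders, not the actual user configurations, and note that `top_k`, `repetition_penalty`, and `min_p` are extensions beyond the standard OpenAI schema that many open-weight serving stacks nonetheless accept.

```python
import json

# Illustrative sampler values; the actual top-3 configs did not load on this page.
sampler = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

# Request body for an OpenAI-compatible /v1/chat/completions endpoint.
payload = {
    "model": "TheFinAI/Fin-o1-8B",
    "messages": [
        {
            "role": "user",
            "content": "Revenue went from $80M to $100M. What is the YoY growth rate?",
        }
    ],
    "max_tokens": 512,
    **sampler,
}

print(json.dumps(payload, indent=2))
```

Sending the payload is then a single HTTP POST with an `Authorization: Bearer <api-key>` header against the provider's endpoint URL.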