FINGU-AI/FinguAI-Chat-v1
Text generation · Concurrency cost: 1 · Model size: 0.6B · Quantization: BF16 · Context length: 32k · Published: Mar 21, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
FinguAI-Chat-v1 by Grinda AI Inc. is a 0.6 billion parameter causal language model, fine-tuned from Qwen1.5-0.5B-Chat, with a 32768 token context length. It specializes in finance, investment, and legal frameworks across English, Korean, and Japanese, aiming to enhance language proficiency and provide insights into global financial markets and regulatory landscapes. The model is designed to equip users with knowledge for roles in investment banking, corporate finance, asset management, and regulatory compliance.
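Since the model is a Qwen1.5-0.5B-Chat fine-tune, a minimal usage sketch can show how a chat prompt is assembled before generation. This assumes the ChatML-style template that Qwen1.5-Chat models ship with; in practice, prefer the tokenizer's own `apply_chat_template` from the Hugging Face `transformers` library.

```python
# Sketch: formatting a chat prompt for FinguAI-Chat-v1 (Qwen1.5-based).
# Assumes the ChatML-style template used by Qwen1.5-Chat; verify against
# the tokenizer's bundled chat template before relying on this.

def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts as a ChatML prompt string."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    parts.append("<|im_start|>assistant\n")  # generation continues from here
    return "\n".join(parts)

if __name__ == "__main__":
    # Loading the 0.6B checkpoint itself requires network access, e.g.:
    # from transformers import AutoModelForCausalLM, AutoTokenizer
    # tok = AutoTokenizer.from_pretrained("FINGU-AI/FinguAI-Chat-v1")
    # model = AutoModelForCausalLM.from_pretrained("FINGU-AI/FinguAI-Chat-v1")
    prompt = build_chatml_prompt([
        {"role": "system", "content": "You are a finance assistant."},
        {"role": "user", "content": "What is a covered bond?"},
    ])
    print(prompt)
```

The resulting string is tokenized and passed to the model, which generates until it emits the end-of-turn marker.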
Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model:

temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
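To illustrate what the temperature, top_k, and top_p settings above actually control, here is a minimal pure-Python sketch of how they filter a logits vector into a sampling distribution. This is an illustration of the standard technique, not Featherless's implementation, and the penalty parameters (frequency, presence, repetition, min_p) are omitted for brevity.

```python
import math

def filter_and_normalize(logits, temperature=1.0, top_k=0, top_p=1.0):
    """Apply temperature scaling, then top-k and top-p (nucleus) filtering.

    Returns a probability distribution over token indices; tokens removed
    by the filters get probability 0. top_k=0 disables the top-k filter.
    """
    # Temperature: scale logits before softmax (lower = sharper distribution).
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    probs = [math.exp(l - m) for l in scaled]
    total = sum(probs)
    probs = [p / total for p in probs]

    # Rank tokens by probability, highest first.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    keep = set()
    cum = 0.0
    for rank, i in enumerate(order):
        if top_k and rank >= top_k:
            break  # top-k: keep only the k most likely tokens
        keep.add(i)
        cum += probs[i]
        if cum >= top_p:
            break  # top-p: smallest prefix covering top_p probability mass

    # Zero out filtered tokens and renormalize the survivors.
    kept = [p if i in keep else 0.0 for i, p in enumerate(probs)]
    z = sum(kept)
    return [p / z for p in kept]
```

With top_k=1 the distribution collapses onto the single most likely token (greedy decoding); raising temperature flattens the distribution and makes lower-ranked tokens more likely to be sampled.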