typhoon-ai/llama3.1-typhoon2-70b-instruct
TEXT GENERATION
- Concurrency Cost: 4
- Model Size: 70B
- Quant: FP8
- Ctx Length: 32k
- Published: Dec 15, 2024
- License: llama3.1
- Architecture: Transformer

scb10x/llama3.1-typhoon2-70b-instruct is a 70-billion-parameter, instruction-tuned, decoder-only large language model developed by scb10x on the Llama 3.1 architecture. Optimized for Thai-language performance, it excels at instruction following, function calling, and specific domains such as math and coding, in both Thai and English. The model features a 90k context length, making it suitable for applications that require extensive contextual understanding and generation in a bilingual setting.
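As a sketch of how a bilingual instruction request to this model might look, the snippet below builds an OpenAI-style chat-completions payload with a Thai user prompt. The endpoint shape and exact parameter support are assumptions (check the hosting provider's API documentation), not something confirmed by this page:

```python
import json

# Hypothetical OpenAI-compatible chat-completions payload. The message
# structure and parameter names are assumptions about the serving API,
# not details confirmed by this model card.
payload = {
    "model": "scb10x/llama3.1-typhoon2-70b-instruct",
    "messages": [
        {"role": "system",
         "content": "You are a helpful bilingual Thai/English assistant."},
        # Thai prompt: "Please summarize this model's strengths in Thai."
        {"role": "user",
         "content": "ช่วยสรุปจุดเด่นของโมเดลนี้เป็นภาษาไทย"},
    ],
    "max_tokens": 256,
}

# Serialize without escaping Thai characters, as a JSON request body would.
body = json.dumps(payload, ensure_ascii=False)
```

The payload is what an HTTP client would POST to a chat-completions endpoint; the Thai text survives serialization intact because `ensure_ascii=False` keeps it as UTF-8 rather than `\u`-escapes.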


Popular Sampler Settings

The three parameter combinations most often used by Featherless users for this model:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
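To make the knobs above concrete, here is a minimal sketch of how these sampler parameters might be bundled into a generation request. The values are illustrative defaults only; they do not reproduce any particular user configuration from this page:

```python
# Illustrative sampler settings for a text-generation request.
# Values are examples, not a recommended or community-sourced config.
sampler_settings = {
    "temperature": 0.7,        # randomness of token selection (lower = greedier)
    "top_p": 0.9,              # nucleus sampling: keep smallest set with 90% mass
    "top_k": 40,               # consider only the 40 most likely tokens
    "frequency_penalty": 0.0,  # penalize tokens by how often they already appeared
    "presence_penalty": 0.0,   # penalize tokens that appeared at all
    "repetition_penalty": 1.1, # multiplicative penalty on repeated tokens
    "min_p": 0.05,             # drop tokens below 5% of the top token's probability
}

# Merge the settings into a completion-style request body.
request = {
    "model": "scb10x/llama3.1-typhoon2-70b-instruct",
    "prompt": "สวัสดีครับ",  # "Hello" in Thai
    **sampler_settings,
}
```

Each key maps onto a common sampling control; which of them a given serving stack honors varies, so unsupported keys may simply be ignored.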