erfanzar/MaticGPT
Text generation · Model size: 1.1B · Quant: BF16 · Context length: 2k · Concurrency cost: 1 · Published: Jan 18, 2024 · License: MIT · Architecture: Transformer · Open weights

erfanzar/MaticGPT is a 1.1-billion-parameter language model fine-tuned by erfanzar with EasyDel on a custom UltraChat dataset. It is intended for a broad range of Natural Language Processing (NLP) tasks, including text classification, sentiment analysis, language translation, and question answering. The model uses the Llama 2 prompt format to maintain context and generate coherent responses within its 2048-token context window.
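As a sketch of the Llama 2 prompt format mentioned above: a single turn wraps the system message in `<<SYS>>` markers inside an `[INST] ... [/INST]` instruction block. The helper below is illustrative only and is not part of this model's release.

```python
def format_llama2_prompt(system_message: str, user_message: str) -> str:
    """Build a single-turn prompt in the Llama 2 chat format.

    The system message sits inside <<SYS>> markers, and the whole
    turn is wrapped in an [INST] ... [/INST] instruction block.
    """
    return (
        "<s>[INST] <<SYS>>\n"
        f"{system_message}\n"
        "<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )

prompt = format_llama2_prompt(
    "You are a helpful assistant.",
    "Translate 'bonjour' to English.",
)
```

Multi-turn conversations extend this pattern by appending each model reply followed by the next `[INST] ... [/INST]` block.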


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model cover the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
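As a sketch, assuming an OpenAI-compatible completions endpoint (the parameter support and the specific values below are assumptions, not the actual user configurations from this page), a request body combining these sampler settings might be built like this:

```python
import json

# Illustrative sampler configuration; the values below are placeholders,
# not the actual "top 3" configs used by Featherless users.
payload = {
    "model": "erfanzar/MaticGPT",
    "prompt": "Explain tokenization in one sentence.",
    "max_tokens": 256,
    "temperature": 0.7,        # randomness of sampling
    "top_p": 0.9,              # nucleus sampling cutoff
    "top_k": 40,               # restrict sampling to the k most likely tokens
    "frequency_penalty": 0.0,  # penalize tokens by how often they appeared
    "presence_penalty": 0.0,   # penalize tokens that appeared at all
    "repetition_penalty": 1.1, # multiplicative penalty on repeated tokens
    "min_p": 0.05,             # drop tokens below this fraction of the top prob
}

body = json.dumps(payload)  # serialized JSON body for a POST request
```

Whether a given parameter (e.g. `min_p` or `repetition_penalty`) is honored depends on the serving backend, so unsupported fields may be silently ignored.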