Models (3,519)

KSU-HW-SEC/llama1B_OB100
Grogros/dmWM-llama-3.2-1B-Instruct-KGWB-OWT_WMBoundary-OWT-WB-v3
VictLee/Llama-3.2-1B-Instruct-terapeutico
Muadil/Llama-3.2-1B-Instruct_sum_DPO_10k_1_3ep_4bit
Mattia2700/Llama-3.2-1B_AllDataSources_it.layer1_NoQuant_32_32_0.05_16CLINICALe3c-sentences_tag
Mattia2700/Llama-3.2-1B_ClinicalWhole_it.layer1_NoQuant_64_16_0.01_16CLINICALe3c-sentences_tag
Grogros/dmWM-llama-3.2-1B-Instruct-OWTWM-DistillationWM-OWTWM2-wmToken-d4-10percent
manav-glean/llama3.2-1b-neuspell-5epochs
Muadil/Llama-3.2-1B-Instruct_sum_PPO_Skywork_177k_2_1ep
jiinking/12_layer_MQA_llama_model
friendshipkim/1b_instruct
jiinking/10_first_MQA_llama_model
krishna195/third_fully_merged
xw17/Llama-3.2-1B-Instruct_finetuned_1_default
peteparker456/translator-llama
omsr/llama-31-hhrlhf-squad-rlhf-policy-model
Mattia2700/Llama-3.2-1B-Instruct_AllDataSources_8e-06_constant_512
Victoriayu/beeyeah-dpo-0.1-0.00001
marcuscedricridia/Mixmix-LlaMAX3.2-1B-Merge
Mattia2700/Llama-3.2-1B_ClinicalWhole_it.layer1_NoQuant_32_32_0.01_16CLINICALe3c-sentences_tag