Models (3,519 total)

Llama 3.2 1B (32K context):

- LuckyLukke/negotio-1B-REFUEL-1
- Grogros/dmWM-llama-3.2-1B-Instruct-HarmData-Al4-OWT-Ref-d4-a0.25_v1
- Grogros/Llama-3.2-1B-OurInstruct-ce-Alpaca-3.0-AlpacaRefuseSmooth
- Mattia2700/Llama-3.2-1B_ClinicalWhole_5e-05_constant_512_flattening
- Victoriayu/beeyeah-weight-0.5-1e-6
- Grogros/Grogros-Llama-3.2-1B-Instruct-IFP-Al4
- Muadil/Llama-3.2-1B-Instruct_sum_KTO_10k_1_3ep_4bit
- halcyon-llm/Llama-halcyon-1B-token-instruct-checkpoint-10240
- jiinking/10_layer_GQA4_llama_model
- Mattia2700/Llama-3.2-1B_AllDataSources_it.layer1_NoQuant_16_32_0.01_16CLINICALe3c-sentences_tag
- jiinking/14_layer_GQA4_llama_model
- keithdrexel/unsloth-llama-3.2-1b-tldr-unsloth-dpo_mid_checkpoint_trl
- open-unlearning/pos_tofu_Llama-3.2-1B-Instruct_retain90_forget10_bio_lr5e-05_wd0.01_epoch5
- jiinking/15_layer_GQA4_llama_model
- Muadil/Llama-3.2-1B-Instruct_sum_PPO_Skywork_10k_1_2ep
- Grogros/Grogros-Llama-3.2-1B-Instruct-SFP-Al4

Llama 3.2 3B (32K context):

- kwoncho/Llama-3.2-3B-KO-EN-Translation
- mm2137/m30
- QuyXuan/documents-master-3B
- juhw/q487