930 results

Column schema:

| Column | Dtype | Range / classes |
|---|---|---|
| eval_name | string | length 12–111 |
| Precision | string | 3 classes |
| Type | string | 7 classes |
| T | string | 7 classes (emoji shorthand for Type) |
| Weight type | string | 2 classes |
| Architecture | string | 64 classes |
| Model | string | length 355–689 (HTML link markup) |
| fullname | string | length 4–102 |
| Model sha | string | length 0–40 |
| Average ⬆️ | float64 | 0.74–52.1 |
| Hub License | string | 27 classes |
| Hub ❤️ | int64 | 0–6.09k |
| #Params (B) | float64 | -1–141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04–187 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.22–0.83 |
| BBH | float64 | 0.25–76.7 |
| MATH Lvl 5 Raw | float64 | 0–0.71 |
| MATH Lvl 5 | float64 | 0–71.5 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.7 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 525 classes |
| Submission Date | string | 263 classes |
| Generation | int64 | 0–10 |
| Base Model | string | length 4–102 |

Notes: `eval_name` is the `fullname` with `/` replaced by `_`, suffixed with the precision. The `Model` column stores HTML link markup pointing to the model's hub page (`https://huggingface.co/<fullname>`) and its per-task results dataset (`open-llm-leaderboard/<owner>__<name>-details`). Each benchmark is stored twice: a raw accuracy in [0, 1] and a normalized 0–100 score.
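The flattened rows that follow can be treated as records keyed by the schema above. A minimal sketch of querying them in plain Python (field names are shortened here for readability; the three records are copied from rows shown in this dump, not the full 930):

```python
# Leaderboard-style records: a small, hand-copied subset of the dump's rows.
rows = [
    {"fullname": "meta-llama/Llama-3.3-70B-Instruct", "params_b": 70.554, "average": 44.847471},
    {"fullname": "meta-llama/Llama-3.2-1B", "params_b": 1.24, "average": 4.19514},
    {"fullname": "agentlans/Llama3.1-LexiHermes-SuperStorm", "params_b": 8.03, "average": 29.430987},
]

# Filter by parameter count, then rank by the leaderboard's headline
# "Average" metric (the mean of the six normalized benchmark scores).
small = [r for r in rows if r["params_b"] < 10]
best_small = max(small, key=lambda r: r["average"])
print(best_small["fullname"])  # agentlans/Llama3.1-LexiHermes-SuperStorm
```

The full table can be pulled the same way once loaded into a list of dicts (e.g. via `pandas` or the `datasets` library); the sketch only assumes the column semantics shown in the schema.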
**agentlans/Llama3.1-8B-drill** · bfloat16 · LlamaForCausalLM (Original weights) · 🤝 base merges and moerges
sha 0e8ce55d45a029e132ea4a09bdfb8736c7434384 · license: (empty) · ❤️ 1 · 8.03 B params · CO₂ 1.382947 kg · on hub ✗ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✗
Average ⬆️ 26.724907 · IFEval 76.516975 (0.76517) · BBH 28.791683 (0.501568) · MATH Lvl 5 17.145015 (0.17145) · GPQA 2.348993 (0.267617) · MUSR 4.704948 (0.36724) · MMLU-PRO 30.841829 (0.377576)
uploaded 2024-12-27 · submitted 2024-12-27 · generation 1 · base: agentlans/Llama3.1-8B-drill (Merge)

**agentlans/Llama3.1-SuperDeepFuse** · bfloat16 · LlamaForCausalLM (Original weights) · 🤝 base merges and moerges
sha 4fbcc81c2c9341f72f921bcc6d17523c62e8b099 · license: llama3.1 · ❤️ 1 · 8.03 B params · CO₂ 1.345605 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✓ · official ✗
Average ⬆️ 27.387201 · IFEval 77.616059 (0.776161) · BBH 29.218387 (0.504854) · MATH Lvl 5 18.277946 (0.182779) · GPQA 3.243848 (0.274329) · MUSR 5.134375 (0.369875) · MMLU-PRO 30.832595 (0.377493)
uploaded 2025-01-20 · submitted 2025-01-21 · generation 1 · base: agentlans/Llama3.1-SuperDeepFuse (Merge)

**agentlans/Llama3.1-Daredevilish** · bfloat16 · LlamaForCausalLM (Original weights) · 🤝 base merges and moerges
sha 43e0c3d9792d39b0e41bbe7fa21116751418b7d9 · license: llama3.1 · ❤️ 1 · 8.03 B params · CO₂ 1.401547 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✓ · official ✗
Average ⬆️ 25.570843 · IFEval 62.91573 (0.629157) · BBH 29.20273 (0.501251) · MATH Lvl 5 12.915408 (0.129154) · GPQA 6.823266 (0.301174) · MUSR 11.603385 (0.409094) · MMLU-PRO 29.964539 (0.369681)
uploaded 2025-01-22 · submitted 2025-01-22 · generation 1 · base: agentlans/Llama3.1-Daredevilish (Merge)

**agentlans/Llama3.1-SuperDeepFuse-CrashCourse12K** · bfloat16 · LlamaForCausalLM (Original weights) · 🔶 fine-tuned on domain-specific datasets
sha 0d38d69d9daeeb59dd837e21e8373a372e3c2226 · license: llama3.1 · ❤️ 1 · 8.03 B params · CO₂ 1.410392 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✗
Average ⬆️ 27.995793 · IFEval 71.873296 (0.718733) · BBH 31.828444 (0.521551) · MATH Lvl 5 18.05136 (0.180514) · GPQA 8.389262 (0.312919) · MUSR 8.597396 (0.402646) · MMLU-PRO 29.235003 (0.363115)
uploaded 2025-01-24 · submitted 2025-01-24 · generation 1 · base: agentlans/Llama3.1-SuperDeepFuse-CrashCourse12K (Merge)

**agentlans/Llama3.1-LexiHermes-SuperStorm** · bfloat16 · LlamaForCausalLM (Original weights) · 🤝 base merges and moerges
sha a7e3bdb9308da0a6aa3108b9e14db4e93000be83 · license: llama3.1 · ❤️ 2 · 8.03 B params · CO₂ 1.238542 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✓ · official ✗
Average ⬆️ 29.430987 · IFEval 78.345457 (0.783455) · BBH 32.547493 (0.526646) · MATH Lvl 5 16.163142 (0.161631) · GPQA 9.731544 (0.322987) · MUSR 8.199219 (0.39626) · MMLU-PRO 31.599069 (0.384392)
uploaded 2025-02-12 · submitted 2025-02-19 · generation 1 · base: agentlans/Llama3.1-LexiHermes-SuperStorm (Merge)

**agentlans/Llama3.1-Daredevilish-Instruct** · bfloat16 · LlamaForCausalLM (Original weights) · 🤝 base merges and moerges
sha 97acbab4c2412facfda8469521ce3e86baa28e23 · license: llama3.1 · ❤️ 1 · 8.03 B params · CO₂ 2.914256 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✓ · official ✗
Average ⬆️ 29.365204 · IFEval 79.259698 (0.792597) · BBH 32.217512 (0.523544) · MATH Lvl 5 17.220544 (0.172205) · GPQA 7.606264 (0.307047) · MUSR 7.91875 (0.391083) · MMLU-PRO 31.968454 (0.387716)
uploaded 2025-01-22 · submitted 2025-01-22 · generation 1 · base: agentlans/Llama3.1-Daredevilish-Instruct (Merge)

**agentlans/Llama-3.2-1B-Instruct-CrashCourse12K** · bfloat16 · LlamaForCausalLM (Original weights) · 🔶 fine-tuned on domain-specific datasets
sha 9b32ead43c60e8c3b16ee38f223236d3a44f4aa6 · license: llama3.2 · ❤️ 0 · 1.236 B params · CO₂ 0.776333 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✗
Average ⬆️ 13.437968 · IFEval 53.950629 (0.539506) · BBH 9.387918 (0.35481) · MATH Lvl 5 7.099698 (0.070997) · GPQA 0 (0.240772) · MUSR 1.196875 (0.321042) · MMLU-PRO 8.992686 (0.180934)
uploaded 2025-01-05 · submitted 2025-01-05 · generation 0 · base: agentlans/Llama-3.2-1B-Instruct-CrashCourse12K

**agentlans/Gemma2-9B-AdvancedFuse** · bfloat16 · Gemma2ForCausalLM (Original weights) · 🤝 base merges and moerges
sha f7d31619237579b7b473f8ebe87047cf881a33a6 · license: gemma · ❤️ 0 · 9.242 B params · CO₂ 3.959093 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✗
Average ⬆️ 20.43458 · IFEval 15.427288 (0.154273) · BBH 40.516738 (0.585937) · MATH Lvl 5 10.045317 (0.100453) · GPQA 11.297539 (0.334732) · MUSR 11.985417 (0.423083) · MMLU-PRO 33.33518 (0.400017)
uploaded 2025-01-03 · submitted 2025-01-21 · generation 1 · base: agentlans/Gemma2-9B-AdvancedFuse (Merge)

**agentlans/Qwen2.5-0.5B-Instruct-CrashCourse-dropout** · bfloat16 · Qwen2ForCausalLM (Original weights) · 💬 chat models (RLHF, DPO, IFT, ...)
sha 09acebcae7826ca66d950641a92a2732d3cf49eb · license: apache-2.0 · ❤️ 0 · 0.494 B params · CO₂ 0.918546 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✗
Average ⬆️ 8.433362 · IFEval 29.488313 (0.294883) · BBH 7.227868 (0.331173) · MATH Lvl 5 4.229607 (0.042296) · GPQA 1.789709 (0.263423) · MUSR 1.106771 (0.334188) · MMLU-PRO 6.757905 (0.160821)
uploaded 2025-01-01 · submitted 2025-01-01 · generation 1 · base: agentlans/Qwen2.5-0.5B-Instruct-CrashCourse-dropout (Merge)
**meta-llama/Llama-3.2-1B** · bfloat16 · LlamaForCausalLM (Original weights) · 🟢 pretrained
sha a7c18587d7f473bfea02aa5639aa349403307b54 · license: llama3.2 · ❤️ 1,700 · 1.24 B params · CO₂ 0.838257 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✗ · merged ✗ · official ✓
Average ⬆️ 4.19514 · IFEval 14.7779 (0.147779) · BBH 4.36603 (0.311495) · MATH Lvl 5 1.208459 (0.012085) · GPQA 0 (0.228188) · MUSR 2.557813 (0.344729) · MMLU-PRO 2.260638 (0.120346)
uploaded 2024-09-18 · submitted 2024-09-23 · generation 0 · base: meta-llama/Llama-3.2-1B

**meta-llama/Llama-3.2-3B** · bfloat16 · LlamaForCausalLM (Original weights) · 🟢 pretrained
sha 95c102307f55fbd6d18ddf28bfbcb537ffdc2806 · license: llama3.2 · ❤️ 529 · 3.213 B params · CO₂ 2.013735 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✗ · merged ✗ · official ✓
Average ⬆️ 8.697823 · IFEval 13.37407 (0.133741) · BBH 14.232665 (0.390512) · MATH Lvl 5 1.888218 (0.018882) · GPQA 2.348993 (0.267617) · MUSR 3.814844 (0.357719) · MMLU-PRO 16.528147 (0.248753)
uploaded 2024-09-18 · submitted 2024-09-27 · generation 0 · base: meta-llama/Llama-3.2-3B

**meta-llama/Llama-3.1-70B** · bfloat16 · LlamaForCausalLM (Original weights) · 🟢 pretrained
sha f7d3cc45ed4ff669a354baf2e0f05e65799a0bee · license: llama3.1 · ❤️ 350 · 70.554 B params · CO₂ 13.601852 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✓
Average ⬆️ 26.200216 · IFEval 16.843752 (0.168438) · BBH 46.399413 (0.626007) · MATH Lvl 5 18.429003 (0.18429) · GPQA 18.344519 (0.387584) · MUSR 16.581771 (0.457188) · MMLU-PRO 40.602837 (0.465426)
uploaded 2024-07-14 · submitted 2024-07-23 · generation 0 · base: meta-llama/Llama-3.1-70B

**meta-llama/Llama-3.1-8B-Instruct** · float16 · LlamaForCausalLM (Original weights) · 🟢 pretrained
sha 0e9e39f249a16976918f6564b8830bc894c89659 · license: llama3.1 · ❤️ 3,759 · 8.03 B params · CO₂ 2.106037 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✗ · merged ✗ · official ✓
Average ⬆️ 23.763729 · IFEval 49.217077 (0.492171) · BBH 29.379192 (0.508703) · MATH Lvl 5 15.558912 (0.155589) · GPQA 8.724832 (0.315436) · MUSR 8.611198 (0.397156) · MMLU-PRO 31.091164 (0.37982)
uploaded 2024-07-18 · submitted 2025-02-06 · generation 1 · base: meta-llama/Meta-Llama-3.1-8B

**meta-llama/Llama-3.1-8B** · float16 · LlamaForCausalLM (Original weights) · 🔶 fine-tuned on domain-specific datasets
sha d04e592bb4f6aa9cfee91e2e20afa771667e1d4b · license: llama3.1 · ❤️ 1,503 · 8.03 B params · CO₂ 1.426487 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✗ · merged ✗ · official ✓
Average ⬆️ 14.420865 · IFEval 12.459829 (0.124598) · BBH 25.304471 (0.465959) · MATH Lvl 5 6.570997 (0.06571) · GPQA 8.053691 (0.310403) · MUSR 8.715104 (0.381188) · MMLU-PRO 25.421099 (0.32879)
uploaded 2024-07-14 · submitted 2024-12-07 · generation 0 · base: meta-llama/Llama-3.1-8B

**meta-llama/Llama-2-7b-hf** · float16 · LlamaForCausalLM (Original weights) · 🟢 pretrained
sha 01c7f73d771dfac7d292323805ebc428287df4f9 · license: llama2 · ❤️ 1,985 · 6.738 B params · CO₂ 1.126189 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✗ · merged ✗ · official ✓
Average ⬆️ 8.806358 · IFEval 25.189386 (0.251894) · BBH 10.351417 (0.34962) · MATH Lvl 5 1.73716 (0.017372) · GPQA 2.237136 (0.266779) · MUSR 3.757813 (0.370062) · MMLU-PRO 9.565233 (0.186087)
uploaded 2023-07-13 · submitted 2024-06-12 · generation 0 · base: meta-llama/Llama-2-7b-hf

**meta-llama/Llama-2-70b-hf** · float16 · LlamaForCausalLM (Original weights) · 🟢 pretrained
sha 3aba440b59558f995867ba6e1f58f21d0336b5bb · license: llama2 · ❤️ 846 · 68.977 B params · CO₂ 59.242493 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✗ · merged ✗ · official ✓
Average ⬆️ 18.372599 · IFEval 24.067807 (0.240678) · BBH 35.900062 (0.547259) · MATH Lvl 5 3.247734 (0.032477) · GPQA 7.04698 (0.302852) · MUSR 9.777604 (0.412354) · MMLU-PRO 30.195405 (0.371759)
uploaded 2023-07-11 · submitted 2024-06-12 · generation 0 · base: meta-llama/Llama-2-70b-hf

**meta-llama/Meta-Llama-3-8B** · bfloat16 · LlamaForCausalLM (Original weights) · 🟢 pretrained
sha 62bd457b6fe961a42a631306577e622c83876cb6 · license: llama3 · ❤️ 6,093 · 8.03 B params · CO₂ 1.745137 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✗ · merged ✗ · official ✓
Average ⬆️ 13.626857 · IFEval 14.550615 (0.145506) · BBH 24.500764 (0.459791) · MATH Lvl 5 4.531722 (0.045317) · GPQA 7.38255 (0.305369) · MUSR 6.242448 (0.361406) · MMLU-PRO 24.553044 (0.320977)
uploaded 2024-04-17 · submitted 2024-06-12 · generation 0 · base: meta-llama/Meta-Llama-3-8B

**meta-llama/Llama-2-13b-hf** · float16 · LlamaForCausalLM (Original weights) · 🟢 pretrained
sha 5c31dfb671ce7cfe2d7bb7c04375e44c55e815b1 · license: llama2 · ❤️ 594 · 13.016 B params · CO₂ 2.22476 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✗ · merged ✗ · official ✓
Average ⬆️ 11.065186 · IFEval 24.824687 (0.248247) · BBH 17.22256 (0.412562) · MATH Lvl 5 1.510574 (0.015106) · GPQA 4.138702 (0.28104) · MUSR 3.385417 (0.35375) · MMLU-PRO 15.309176 (0.237783)
uploaded 2023-07-13 · submitted 2024-06-12 · generation 0 · base: meta-llama/Llama-2-13b-hf

**meta-llama/Llama-3.3-70B-Instruct** · bfloat16 · LlamaForCausalLM (Original weights) · 💬 chat models (RLHF, DPO, IFT, ...)
sha (empty) · license: llama3.3 · ❤️ 2,169 · 70.554 B params · CO₂ 76.559074 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✓
Average ⬆️ 44.847471 · IFEval 89.97582 (0.899758) · BBH 56.561411 (0.691931) · MATH Lvl 5 48.338369 (0.483384) · GPQA 10.514541 (0.328859) · MUSR 15.565625 (0.446125) · MMLU-PRO 48.129063 (0.533162)
uploaded 2024-11-26 · submitted 2024-12-03 · generation 1 · base: meta-llama/Llama-3.3-70B-Instruct (Merge)
**meta-llama/Meta-Llama-3-70B** · bfloat16 · LlamaForCausalLM (Original weights) · 🟢 pretrained
sha b4d08b7db49d488da3ac49adf25a6b9ac01ae338 · license: llama3 · ❤️ 854 · 70.554 B params · CO₂ 46.814372 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✗ · merged ✗ · official ✓
Average ⬆️ 26.70535 · IFEval 16.031906 (0.160319) · BBH 48.709813 (0.646107) · MATH Lvl 5 18.58006 (0.185801) · GPQA 19.686801 (0.397651) · MUSR 16.011198 (0.451823) · MMLU-PRO 41.212323 (0.470911)
uploaded 2024-04-17 · submitted 2024-06-12 · generation 0 · base: meta-llama/Meta-Llama-3-70B

**meta-llama/Llama-3.1-70B-Instruct** · bfloat16 · LlamaForCausalLM (Original weights) · 💬 chat models (RLHF, DPO, IFT, ...)
sha b9461463b511ed3c0762467538ea32cf7c9669f2 · license: llama3.1 · ❤️ 797 · 70.554 B params · CO₂ 40.221824 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✓
Average ⬆️ 43.409948 · IFEval 86.688542 (0.866885) · BBH 55.927992 (0.691729) · MATH Lvl 5 38.066465 (0.380665) · GPQA 14.205817 (0.356544) · MUSR 17.691146 (0.458063) · MMLU-PRO 47.879728 (0.530918)
uploaded 2024-07-16 · submitted 2024-08-15 · generation 1 · base: meta-llama/Meta-Llama-3.1-70B

**meta-llama/Llama-3.2-3B-Instruct** · bfloat16 · LlamaForCausalLM (Original weights) · 💬 chat models (RLHF, DPO, IFT, ...)
sha 276b29ce8303c9b88966a9b32fc75692dce4d8e1 · license: llama3.2 · ❤️ 1,245 · 3.213 B params · CO₂ 1.927962 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✓
Average ⬆️ 24.204651 · IFEval 73.931613 (0.739316) · BBH 24.059186 (0.461007) · MATH Lvl 5 17.673716 (0.176737) · GPQA 3.803132 (0.278523) · MUSR 1.373437 (0.352854) · MMLU-PRO 24.38682 (0.319481)
uploaded 2024-09-18 · submitted 2024-09-27 · generation 0 · base: meta-llama/Llama-3.2-3B-Instruct

**meta-llama/Llama-3.2-1B-Instruct** · bfloat16 · LlamaForCausalLM (Original weights) · 💬 chat models (RLHF, DPO, IFT, ...)
sha d0a2081ed47e20ce524e8bc5d132f3fad2f69ff0 · license: llama3.2 · ❤️ 831 · 1.24 B params · CO₂ 0.809809 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✓
Average ⬆️ 14.443126 · IFEval 56.983138 (0.569831) · BBH 8.742521 (0.349685) · MATH Lvl 5 7.024169 (0.070242) · GPQA 3.355705 (0.275168) · MUSR 2.973438 (0.332854) · MMLU-PRO 7.579787 (0.168218)
uploaded 2024-09-18 · submitted 2024-09-23 · generation 0 · base: meta-llama/Llama-3.2-1B-Instruct

**meta-llama/Llama-2-7b-chat-hf** · float16 · LlamaForCausalLM (Original weights) · 🟢 pretrained
sha f5db02db724555f92da89c216ac04704f23d4590 · license: llama2 · ❤️ 4,318 · 6.738 B params · CO₂ 1.791393 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✓
Average ⬆️ 9.609483 · IFEval 39.864781 (0.398648) · BBH 4.459172 (0.311355) · MATH Lvl 5 1.963746 (0.019637) · GPQA 0.447427 (0.253356) · MUSR 3.277344 (0.367552) · MMLU-PRO 7.64443 (0.1688)
uploaded 2023-07-13 · submitted 2024-08-30 · generation 0 · base: meta-llama/Llama-2-7b-chat-hf

**meta-llama/Llama-2-70b-chat-hf** · float16 · LlamaForCausalLM (Original weights) · 💬 chat models (RLHF, DPO, IFT, ...)
sha e9149a12809580e8602995856f8098ce973d1080 · license: llama2 · ❤️ 2,182 · 68.977 B params · CO₂ 45.79691 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✓
Average ⬆️ 13.073696 · IFEval 49.579228 (0.495792) · BBH 4.613767 (0.304247) · MATH Lvl 5 2.945619 (0.029456) · GPQA 1.901566 (0.264262) · MUSR 3.483333 (0.368667) · MMLU-PRO 15.918661 (0.243268)
uploaded 2023-07-14 · submitted 2024-06-12 · generation 0 · base: meta-llama/Llama-2-70b-chat-hf

**meta-llama/Meta-Llama-3-70B-Instruct** · bfloat16 · LlamaForCausalLM (Original weights) · 💬 chat models (RLHF, DPO, IFT, ...)
sha 7129260dd854a80eb10ace5f61c20324b472b31c · license: llama3 · ❤️ 1,464 · 70.554 B params · CO₂ 36.4783 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✓
Average ⬆️ 36.372224 · IFEval 80.990771 (0.809908) · BBH 50.185133 (0.65467) · MATH Lvl 5 24.471299 (0.244713) · GPQA 4.9217 (0.286913) · MUSR 10.920573 (0.415365) · MMLU-PRO 46.743868 (0.520695)
uploaded 2024-04-17 · submitted 2024-06-12 · generation 1 · base: meta-llama/Meta-Llama-3-70B

**meta-llama/Meta-Llama-3-8B-Instruct** · float16 · LlamaForCausalLM (Original weights) · 💬 chat models (RLHF, DPO, IFT, ...)
sha e1945c40cd546c78e41f1151f4db032b271faeaa · license: llama3 · ❤️ 3,873 · 8.03 B params · CO₂ 1.898947 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✗ · merged ✗ · official ✓
Average ⬆️ 20.609159 · IFEval 47.82322 (0.478232) · BBH 26.795284 (0.491026) · MATH Lvl 5 9.138973 (0.09139) · GPQA 5.704698 (0.292785) · MUSR 5.401042 (0.380542) · MMLU-PRO 28.791741 (0.359126)
uploaded 2024-04-17 · submitted 2024-07-08 · generation 0 · base: meta-llama/Meta-Llama-3-8B-Instruct

**meta-llama/Meta-Llama-3-8B-Instruct** · bfloat16 · LlamaForCausalLM (Original weights) · 💬 chat models (RLHF, DPO, IFT, ...)
sha e1945c40cd546c78e41f1151f4db032b271faeaa · license: llama3 · ❤️ 3,873 · 8.03 B params · CO₂ 0.7975 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✓
Average ⬆️ 23.908736 · IFEval 74.083986 (0.74084) · BBH 28.24495 (0.498871) · MATH Lvl 5 8.685801 (0.086858) · GPQA 1.230425 (0.259228) · MUSR 1.602865 (0.356823) · MMLU-PRO 29.604388 (0.366439)
uploaded 2024-04-17 · submitted 2024-06-12 · generation 0 · base: meta-llama/Meta-Llama-3-8B-Instruct

**meta-llama/Llama-2-13b-chat-hf** · float16 · LlamaForCausalLM (Original weights) · 💬 chat models (RLHF, DPO, IFT, ...)
sha a2cb7a712bb6e5e736ca7f8cd98167f81a0b5bd8 · license: llama2 · ❤️ 1,067 · 13.016 B params · CO₂ 1.749139 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✓
Average ⬆️ 11.129635 · IFEval 39.847272 (0.398473) · BBH 7.15538 (0.334274) · MATH Lvl 5 1.359517 (0.013595) · GPQA 0 (0.231544) · MUSR 8.157813 (0.400729) · MMLU-PRO 10.257831 (0.19232)
uploaded 2023-07-13 · submitted 2024-06-12 · generation 0 · base: meta-llama/Llama-2-13b-chat-hf
**gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b** · bfloat16 · LlamaForCausalLM (Original weights) · 🤝 base merges and moerges
sha 2d73b7e1c7157df482555944d6a6b1362bc6c3c5 · license: llama3 · ❤️ 1 · 70.554 B params · CO₂ 21.804586 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✓ · official ✗
Average ⬆️ 38.696133 · IFEval 80.718494 (0.807185) · BBH 51.508386 (0.667431) · MATH Lvl 5 29.380665 (0.293807) · GPQA 10.290828 (0.327181) · MUSR 15.002865 (0.436823) · MMLU-PRO 45.275561 (0.50748)
uploaded 2024-05-24 · submitted 2024-06-27 · generation 1 · base: gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b (Merge)

**pszemraj/Llama-3-6.3b-v0.1** · bfloat16 · LlamaForCausalLM (Original weights) · 🟩 continuously pretrained
sha 7000b39346162f95f19aa4ca3975242db61902d7 · license: llama3 · ❤️ 6 · 6.3 B params · CO₂ 1.628927 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✗ · merged ✗ · official ✗
Average ⬆️ 10.384307 · IFEval 10.438969 (0.10439) · BBH 18.679996 (0.419681) · MATH Lvl 5 2.114804 (0.021148) · GPQA 4.474273 (0.283557) · MUSR 6.154167 (0.390833) · MMLU-PRO 20.443632 (0.283993)
uploaded 2024-05-17 · submitted 2024-06-26 · generation 1 · base: meta-llama/Meta-Llama-3-8B

**jebish7/Llama-3.1-8B-Instruct** · bfloat16 · LlamaForCausalLM (Original weights) · 🟢 pretrained
sha c31aabd088d023ac58f11e2539eece7fbdce9920 · license: llama3.1 · ❤️ 0 · 8.03 B params · CO₂ 1.440992 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✗ · merged ✗ · official ✗
Average ⬆️ 24.022312 · IFEval 50.583452 (0.505835) · BBH 29.191619 (0.508839) · MATH Lvl 5 15.483384 (0.154834) · GPQA 9.50783 (0.321309) · MUSR 8.507292 (0.399792) · MMLU-PRO 30.860298 (0.377743)
uploaded 2025-02-06 · submitted 2025-02-06 · generation 1 · base: meta-llama/Meta-Llama-3.1-8B

**NousResearch/Hermes-3-Llama-3.1-70B** · bfloat16 · LlamaForCausalLM (Original weights) · 💬 chat models (RLHF, DPO, IFT, ...)
sha 093242c69a91f8d9d5b8094c380b88772f9bd7f8 · license: llama3 · ❤️ 107 · 70.554 B params · CO₂ 22.415782 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✓
Average ⬆️ 38.514771 · IFEval 76.614383 (0.766144) · BBH 53.765409 (0.675578) · MATH Lvl 5 20.996979 (0.20997) · GPQA 14.876957 (0.361577) · MUSR 23.428646 (0.494896) · MMLU-PRO 41.40625 (0.472656)
uploaded 2024-07-29 · submitted 2024-08-28 · generation 1 · base: meta-llama/Meta-Llama-3.1-70B

**ValiantLabs/Llama3.1-8B-Fireplace2** · float16 · LlamaForCausalLM (Original weights) · 🔶 fine-tuned on domain-specific datasets
sha be3a5c18b5e8e86a3703df1a8227f784ad2c713c · license: llama3.1 · ❤️ 6 · 8.03 B params · CO₂ 0.916234 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✗
Average ⬆️ 18.312602 · IFEval 54.8324 (0.548324) · BBH 24.070273 (0.460982) · MATH Lvl 5 5.81571 (0.058157) · GPQA 5.145414 (0.288591) · MUSR 4.379427 (0.343302) · MMLU-PRO 15.632388 (0.240691)
uploaded 2024-07-23 · submitted 2024-07-25 · generation 2 · base: meta-llama/Meta-Llama-3.1-8B

**MaziyarPanahi/calme-2.2-llama3-70b** · bfloat16 · LlamaForCausalLM (Original weights) · 💬 chat models (RLHF, DPO, IFT, ...)
sha 95366b974baedee4d95c1e841bc3d15e94753804 · license: llama3 · ❤️ 17 · 70.554 B params · CO₂ 21.256547 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✗
Average ⬆️ 38.140064 · IFEval 82.084868 (0.820849) · BBH 48.571706 (0.643543) · MATH Lvl 5 23.942598 (0.239426) · GPQA 12.192394 (0.341443) · MUSR 15.304948 (0.444573) · MMLU-PRO 46.743868 (0.520695)
uploaded 2024-04-27 · submitted 2024-06-26 · generation 2 · base: meta-llama/Meta-Llama-3-70B

**DeepMount00/Llama-3-8b-Ita** · bfloat16 · LlamaForCausalLM (Original weights) · 🔶 fine-tuned on domain-specific datasets
sha d40847d2981b588690c1dc21d5157d3f4afb2978 · license: llama3 · ❤️ 24 · 8.03 B params · CO₂ 1.556517 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✗
Average ⬆️ 26.796816 · IFEval 75.302974 (0.75303) · BBH 28.077746 (0.493577) · MATH Lvl 5 6.646526 (0.066465) · GPQA 7.38255 (0.305369) · MUSR 11.679688 (0.426771) · MMLU-PRO 31.691415 (0.385223)
uploaded 2024-05-01 · submitted 2024-06-27 · generation 1 · base: meta-llama/Meta-Llama-3-8B

**NotASI/FineTome-Llama3.2-3B-1002** · float16 · LlamaForCausalLM (Original weights) · 🔶 fine-tuned on domain-specific datasets
sha 7c8497a24a381e3bfd77bc92e5685442768790d0 · license: llama3.2 · ❤️ 1 · 3 B params · CO₂ 2.130901 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✓ · merged ✗ · official ✗
Average ⬆️ 16.7624 · IFEval 54.744966 (0.54745) · BBH 19.520061 (0.431947) · MATH Lvl 5 6.268882 (0.062689) · GPQA 0.111857 (0.250839) · MUSR 3.963802 (0.36851) · MMLU-PRO 15.964835 (0.243684)
uploaded 2024-10-04 · submitted 2024-10-05 · generation 2 · base: meta-llama/Llama-3.2-3B-Instruct

**ValiantLabs/Llama3.2-3B-ShiningValiant2** · float16 · LlamaForCausalLM (Original weights) · 🔶 fine-tuned on domain-specific datasets
sha 1336e200485675c9b92baae17831eab17c601803 · license: llama3.2 · ❤️ 3 · 3.213 B params · CO₂ 3.463997 kg · on hub ✓ · MoE ✗ · flagged ✗ · chat template ✗ · merged ✗ · official ✗
Average ⬆️ 14.390696 · IFEval 26.251014 (0.26251) · BBH 18.912709 (0.422593) · MATH Lvl 5 8.232628 (0.082326) · GPQA 4.026846 (0.280201) · MUSR 8.597396 (0.386646) · MMLU-PRO 20.323582 (0.282912)
uploaded 2024-09-27 · submitted 2024-11-05 · generation 1 · base: meta-llama/Llama-3.2-3B-Instruct
ValiantLabs_Llama3.1-70B-ShiningValiant2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-70B-ShiningValiant2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-70B-ShiningValiant2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-70B-ShiningValiant2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-70B-ShiningValiant2
55436621ed65f0b79e7c6324b780bd6a18e06c79
36.493184
llama3.1
3
70.554
true
false
false
false
27.99457
0.535535
53.55346
0.673841
52.390969
0.291541
29.154079
0.392617
19.01566
0.468104
18.479688
0.517287
46.365248
false
false
2024-10-30
2024-10-30
2
meta-llama/Meta-Llama-3.1-70B
ValiantLabs_Llama3.1-8B-Enigma_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Enigma" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Enigma</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Enigma-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Enigma
332c99d80f378c77b090745a5aac10f8ab339519
16.625157
llama3.1
10
8.03
true
false
false
false
7.275141
0.268055
26.805543
0.44776
22.012915
0.089124
8.912387
0.287752
5.033557
0.419604
10.217188
0.340924
26.769356
false
false
2024-08-11
2024-10-02
2
meta-llama/Meta-Llama-3.1-8B
win10_Llama-3.2-3B-Instruct-24-9-29_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/win10/Llama-3.2-3B-Instruct-24-9-29" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">win10/Llama-3.2-3B-Instruct-24-9-29</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/win10__Llama-3.2-3B-Instruct-24-9-29-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
win10/Llama-3.2-3B-Instruct-24-9-29
4defb10e2415111abb873d695dd40c387c1d6d57
24.004698
llama3.2
0
3.213
true
false
false
true
1.427211
0.733221
73.322119
0.461423
24.196426
0.170695
17.069486
0.274329
3.243848
0.355521
1.440104
0.322806
24.756206
false
false
2024-09-29
2024-10-11
2
meta-llama/Llama-3.2-3B-Instruct
NotASI_FineTome-Llama3.2-1B-0929_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/NotASI/FineTome-Llama3.2-1B-0929" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">NotASI/FineTome-Llama3.2-1B-0929</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/NotASI__FineTome-Llama3.2-1B-0929-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
NotASI/FineTome-Llama3.2-1B-0929
61c8742238d0cfe68a0a3f61326b84cd6624ad02
9.953181
llama3.2
1
1.236
true
false
false
true
0.930817
0.399072
39.907224
0.324627
5.741405
0.036254
3.625378
0.272651
3.020134
0.34876
2.661719
0.142869
4.763224
false
false
2024-09-29
2024-10-04
2
meta-llama/Llama-3.2-1B-Instruct
ValiantLabs_Llama3.1-8B-ShiningValiant2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-ShiningValiant2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-ShiningValiant2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-ShiningValiant2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-ShiningValiant2
6b2b5694a192cb29ad0e4314138affa25b630c0e
23.157281
llama3.1
16
8.03
true
false
false
true
2.445238
0.649565
64.956538
0.477391
26.346119
0.056647
5.664653
0.310403
8.053691
0.390865
7.458073
0.338182
26.464613
false
false
2024-08-06
2024-08-10
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.1-8B-ShiningValiant2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-ShiningValiant2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-ShiningValiant2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-ShiningValiant2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-ShiningValiant2
6b2b5694a192cb29ad0e4314138affa25b630c0e
15.458036
llama3.1
16
8.03
true
false
false
false
6.528982
0.267806
26.780609
0.442929
21.61815
0.052115
5.21148
0.302013
6.935123
0.395917
10.789583
0.292719
21.413268
false
false
2024-08-06
2024-11-05
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.2-3B-Esper2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.2-3B-Esper2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.2-3B-Esper2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.2-3B-Esper2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.2-3B-Esper2
64a2c619a2e1680ab42945fcf5b75a5242cab3a1
10.944295
llama3.2
3
3.213
true
false
false
false
1.477769
0.274975
27.497484
0.380826
13.851733
0.036254
3.625378
0.270134
2.684564
0.354958
4.036458
0.225731
13.970154
false
false
2024-10-03
2024-10-09
1
meta-llama/Llama-3.2-3B-Instruct
ValiantLabs_Llama3.1-8B-Cobalt_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Cobalt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Cobalt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Cobalt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Cobalt
3a69145a2acc1f7f51735aa3ae5d81c090249c65
20.239394
llama3.1
6
8.03
true
false
false
false
2.627763
0.349613
34.961347
0.494677
27.417777
0.126888
12.688822
0.303691
7.158837
0.395948
9.826823
0.364445
29.382757
false
false
2024-08-16
2024-10-02
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.1-8B-Cobalt_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Cobalt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Cobalt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Cobalt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Cobalt
3a69145a2acc1f7f51735aa3ae5d81c090249c65
25.558664
llama3.1
6
8.03
true
false
false
true
0.938171
0.716835
71.683467
0.49107
27.235483
0.153323
15.332326
0.286074
4.809843
0.35124
4.704948
0.366273
29.585919
false
false
2024-08-16
2024-09-20
2
meta-llama/Meta-Llama-3.1-8B
Weyaxi_Einstein-v8-Llama3.2-1B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/Einstein-v8-Llama3.2-1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Einstein-v8-Llama3.2-1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Einstein-v8-Llama3.2-1B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/Einstein-v8-Llama3.2-1B
1edc6abcb8eedd047bc40b79d2d36c3723ff28e2
4.640409
llama3.2
2
1.236
true
false
false
true
0.775849
0.186223
18.622256
0.301843
3.013774
0.000755
0.075529
0.258389
1.118568
0.361781
3.222656
0.116107
1.789672
false
false
2024-09-28
2024-09-30
1
meta-llama/Llama-3.2-1B
cognitivecomputations_dolphin-2.9.4-llama3.1-8b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.4-llama3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.4-llama3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.4-llama3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/dolphin-2.9.4-llama3.1-8b
7b73d1b7760bf9abac168de3d388b30d1ca1a138
7.131861
llama3.1
96
8.03
true
false
false
true
2.637556
0.275724
27.572397
0.352363
8.972089
0.012085
1.208459
0.263423
1.789709
0.323615
0.61849
0.12367
2.630024
false
true
2024-08-04
2024-09-17
1
meta-llama/Meta-Llama-3.1-8B
KingNish_Reasoning-Llama-3b-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/KingNish/Reasoning-Llama-3b-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KingNish/Reasoning-Llama-3b-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KingNish__Reasoning-Llama-3b-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KingNish/Reasoning-Llama-3b-v0.1
d164caf591c42a4cbc3b21d46493e72fbdbd9de8
20.21238
llama3.2
9
3.213
true
false
false
true
1.35047
0.622463
62.246284
0.434336
19.862451
0.129909
12.990937
0.259228
1.230425
0.31676
2.395052
0.302942
22.549128
false
false
2024-10-10
2024-10-26
1
meta-llama/Llama-3.2-3B-Instruct
ValiantLabs_Llama3.1-8B-Esper2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Esper2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Esper2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Esper2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Esper2
38f24f2fe90f839acbc57e7530221acf1232e9dc
13.94081
llama3.1
2
8.03
true
false
false
false
1.75353
0.25674
25.673989
0.446987
22.195685
0.058912
5.891239
0.272651
3.020134
0.356073
5.709115
0.290392
21.154699
false
false
2024-10-02
2024-10-09
2
meta-llama/Meta-Llama-3.1-8B
MaziyarPanahi_calme-2.4-llama3-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.4-llama3-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.4-llama3-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.4-llama3-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.4-llama3-70b
cb03e4d810b82d86e7cb01ab146bade09a5d06d1
32.486225
llama3
14
70.554
true
false
false
true
35.487517
0.502737
50.273718
0.641819
48.397766
0.244713
24.471299
0.339765
11.96868
0.428792
13.098958
0.520362
46.70693
false
false
2024-04-28
2024-06-26
2
meta-llama/Meta-Llama-3-70B
DeepMount00_Llama-3.1-Distilled_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DeepMount00/Llama-3.1-Distilled" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepMount00/Llama-3.1-Distilled</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Llama-3.1-Distilled-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepMount00/Llama-3.1-Distilled
0a94c7ddb196107e8bf1b02e31488ff8c17b9eb3
29.631398
llama3
1
8.03
true
false
false
true
1.678
0.784379
78.437878
0.510088
30.841421
0.203172
20.317221
0.303691
7.158837
0.405812
10.126562
0.378158
30.906472
false
false
2024-10-25
2024-10-25
1
meta-llama/Meta-Llama-3-8B
ValiantLabs_Llama3.2-3B-Enigma_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.2-3B-Enigma" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.2-3B-Enigma</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.2-3B-Enigma-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.2-3B-Enigma
ca6adf3a289ce47c7598139e7a312e2b4b3708ce
11.692731
llama3.2
7
3.213
true
false
false
false
2.244793
0.278622
27.862183
0.372259
12.434026
0.043807
4.380665
0.261745
1.565996
0.392135
8.05026
0.242769
15.863254
false
false
2024-09-30
2024-10-02
1
meta-llama/Llama-3.2-3B-Instruct
NousResearch_Hermes-3-Llama-3.1-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/NousResearch/Hermes-3-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">NousResearch/Hermes-3-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Hermes-3-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
NousResearch/Hermes-3-Llama-3.1-8B
aabb745a717e133b74dcae23195d2635cf5f38cc
23.490877
llama3
303
8.03
true
false
false
true
0.905808
0.617017
61.701729
0.517745
30.724097
0.047583
4.758308
0.297819
6.375839
0.436938
13.617187
0.313913
23.7681
false
true
2024-07-28
2024-08-28
1
meta-llama/Meta-Llama-3.1-8B
aaditya_Llama3-OpenBioLLM-70B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/aaditya/Llama3-OpenBioLLM-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aaditya/Llama3-OpenBioLLM-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/aaditya__Llama3-OpenBioLLM-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
aaditya/Llama3-OpenBioLLM-70B
5f79deaf38bc5f662943d304d59cb30357e8e5bd
34.97902
llama3
406
70
true
false
false
true
19.314045
0.759674
75.967433
0.639887
47.147075
0.19713
19.712991
0.322987
9.731544
0.441719
14.348177
0.486702
42.966903
false
false
2024-04-24
2024-08-30
2
meta-llama/Meta-Llama-3-70B
mattshumer_Reflection-Llama-3.1-70B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/mattshumer/Reflection-Llama-3.1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mattshumer/Reflection-Llama-3.1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mattshumer__Reflection-Llama-3.1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mattshumer/Reflection-Llama-3.1-70B
458962ed801fac4eadd01a91a2029a3a82f4cd84
24.392555
llama3.1
1,714
70.554
true
false
false
true
39.031041
0.004521
0.452134
0.645001
47.866237
0.214502
21.450151
0.363255
15.100671
0.457656
17.540365
0.495512
43.945774
false
false
2024-09-05
2024-12-25
2
meta-llama/Meta-Llama-3.1-70B
sequelbox_Llama3.1-8B-MOTH_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/sequelbox/Llama3.1-8B-MOTH" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sequelbox/Llama3.1-8B-MOTH</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sequelbox__Llama3.1-8B-MOTH-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sequelbox/Llama3.1-8B-MOTH
8db363e36b1efc9015ab14648e68bcfba9e8d8a0
20.836504
llama3.1
1
8.03
true
false
false
true
2.929274
0.524494
52.44939
0.490247
27.916332
0.121601
12.160121
0.268456
2.46085
0.368917
4.047917
0.33386
25.984412
false
false
2024-09-01
2024-09-19
2
meta-llama/Meta-Llama-3.1-8B
cognitivecomputations_dolphin-2.9.1-llama-3-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.1-llama-3-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.1-llama-3-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.1-llama-3-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/dolphin-2.9.1-llama-3-70b
31adf616c3c9176d147e0a62e9fedb7bf97678ac
25.534386
llama3
42
70.554
true
false
false
true
24.298176
0.376017
37.601675
0.520492
31.101152
0.182024
18.202417
0.308725
7.829978
0.497562
23.695312
0.412982
34.775783
false
true
2024-05-22
2024-06-27
1
meta-llama/Meta-Llama-3-70B
unsloth_Llama-3.2-1B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/unsloth/Llama-3.2-1B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">unsloth/Llama-3.2-1B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/unsloth__Llama-3.2-1B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
unsloth/Llama-3.2-1B-Instruct
eb49081324edb2ff14f848ce16393c067c6f4976
14.532451
llama3.2
67
1.236
true
false
false
true
0.722639
0.580997
58.099731
0.34847
8.31685
0.082326
8.232628
0.267617
2.348993
0.319615
1.951823
0.174202
8.244681
false
false
2024-09-25
2025-01-23
1
meta-llama/Llama-3.2-1B-Instruct
MaziyarPanahi_Llama-3-70B-Instruct-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Llama-3-70B-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Llama-3-70B-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Llama-3-70B-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Llama-3-70B-Instruct-v0.1
6db1cb4256525fc5429734ddc0eb941d08d0be30
26.333913
llama3
1
70.554
true
false
false
true
22.527972
0.471438
47.143801
0.536626
32.712917
0.180514
18.05136
0.284396
4.58613
0.443302
15.31276
0.461769
40.196513
false
false
2024-05-14
2024-06-26
2
meta-llama/Meta-Llama-3-70B
MaziyarPanahi_calme-2.3-llama3-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.3-llama3-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.3-llama3-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.3-llama3-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.3-llama3-70b
bd17453eaae0e36d1e1e17da13fdd155fce91a29
37.067032
llama3
4
70.554
true
false
false
true
19.273618
0.80104
80.104013
0.639917
48.008585
0.232628
23.26284
0.338087
11.744966
0.426125
12.565625
0.520445
46.716164
false
false
2024-04-27
2024-08-30
2
meta-llama/Meta-Llama-3-70B
ValiantLabs_Llama3.1-8B-Fireplace2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Fireplace2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Fireplace2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Fireplace2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Fireplace2
ef129903bbdcc59efdbe10fe9061bff473334a99
18.570581
llama3.1
6
8.03
true
false
false
true
1.802195
0.532812
53.281183
0.461331
24.089954
0.087613
8.761329
0.28943
5.257271
0.336667
4.216667
0.242354
15.81708
false
false
2024-07-23
2024-08-10
2
meta-llama/Meta-Llama-3.1-8B
BAAI_OPI-Llama-3.1-8B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/OPI-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/OPI-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__OPI-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/OPI-Llama-3.1-8B-Instruct
48504799d009b4e1b29e6d2948a7cde68acdc3b0
8.531604
llama3.1
2
8.03
true
false
false
true
1.343314
0.207455
20.745511
0.355122
9.768712
0.013595
1.359517
0.274329
3.243848
0.323302
3.579427
0.212434
12.492612
false
false
2024-09-06
2024-09-21
2
meta-llama/Meta-Llama-3.1-8B
haoranxu_Llama-3-Instruct-8B-SimPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/haoranxu/Llama-3-Instruct-8B-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">haoranxu/Llama-3-Instruct-8B-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/haoranxu__Llama-3-Instruct-8B-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
haoranxu/Llama-3-Instruct-8B-SimPO
8346770280fa169d41d737785dd63a66e9d94501
24.990729
llama3
1
8.03
true
false
false
true
1.159156
0.734745
73.474492
0.497924
28.226376
0.087613
8.761329
0.290268
5.369128
0.356604
3.742188
0.373338
30.370863
false
false
2024-06-07
2024-07-28
1
meta-llama/Meta-Llama-3-8B-Instruct
ymcki_Llama-3.1-8B-GRPO-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ymcki/Llama-3.1-8B-GRPO-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/Llama-3.1-8B-GRPO-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__Llama-3.1-8B-GRPO-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ymcki/Llama-3.1-8B-GRPO-Instruct
ae73ec53fb75499a33a506b354b55b29d02392b9
28.168376
llama3.1
0
8.03
true
false
false
true
0.656791
0.744537
74.453672
0.513159
30.353177
0.202417
20.241692
0.294463
5.928412
0.381656
7.607031
0.373836
30.426271
false
false
2025-02-20
2025-02-20
2
meta-llama/Meta-Llama-3.1-8B
Groq_Llama-3-Groq-8B-Tool-Use_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Groq/Llama-3-Groq-8B-Tool-Use" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Groq/Llama-3-Groq-8B-Tool-Use</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Groq__Llama-3-Groq-8B-Tool-Use-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Groq/Llama-3-Groq-8B-Tool-Use
3bf6b914d7043d1bbfcfc7a9aa7581a8104eabac
21.445601
llama3
274
8.03
true
false
false
true
1.008657
0.609823
60.982305
0.486338
27.254234
0.060423
6.042296
0.267617
2.348993
0.366031
5.38724
0.339927
26.65854
false
false
2024-06-24
2025-01-01
1
meta-llama/Meta-Llama-3-8B
akhadangi_Llama3.2.1B.0.1-Last_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/akhadangi/Llama3.2.1B.0.1-Last" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">akhadangi/Llama3.2.1B.0.1-Last</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/akhadangi__Llama3.2.1B.0.1-Last-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
akhadangi/Llama3.2.1B.0.1-Last
72a0652d9432dc62a296437a7880c0a0cb267097
3.266722
llama3.2
0
1.236
true
false
false
false
0.372928
0.094972
9.497245
0.316378
4.256106
0.021148
2.114804
0.238255
0
0.334063
1.757812
0.117769
1.974365
false
false
2025-03-10
2025-03-10
1
akhadangi/Llama3.2.1B.0.1-Last (Merge)
akhadangi_Llama3.2.1B.0.01-First_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/akhadangi/Llama3.2.1B.0.01-First" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">akhadangi/Llama3.2.1B.0.01-First</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/akhadangi__Llama3.2.1B.0.01-First-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
akhadangi/Llama3.2.1B.0.01-First
b9d1edeb95f15c92118f5b4677c2f97ca5523a3d
3.109892
llama3.2
0
1.236
true
false
false
false
0.370919
0.081359
8.135857
0.318919
4.766231
0.018127
1.812689
0.248322
0
0.319396
1.757812
0.119681
2.186761
false
false
2025-03-10
2025-03-10
1
akhadangi/Llama3.2.1B.0.01-First (Merge)
Nekochu_Llama-3.1-8B-German-ORPO_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nekochu/Llama-3.1-8B-German-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nekochu/Llama-3.1-8B-German-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nekochu__Llama-3.1-8B-German-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nekochu/Llama-3.1-8B-German-ORPO
463ea77e46fb6d69c86f23df21b0ab0a0b9e77cd
23.254052
llama3.1
1
8.03
true
false
false
false
1.888395
0.461071
46.107107
0.498258
29.419254
0.117069
11.706949
0.316275
8.836689
0.46475
16.860417
0.339345
26.593898
false
false
2024-09-13
2024-09-24
2
meta-llama/Meta-Llama-3.1-8B
HumanLLMs_Humanish-LLama3-8B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HumanLLMs/Humanish-LLama3-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HumanLLMs/Humanish-LLama3-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HumanLLMs__Humanish-LLama3-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HumanLLMs/Humanish-LLama3-8B-Instruct
42f73ada2b7fb16f18a75404d72b7911bf1e65ce
22.678204
llama3
18
8.03
true
false
false
true
1.496556
0.64979
64.979033
0.496771
28.012477
0.102719
10.271903
0.255872
0.782998
0.358156
2.002865
0.37018
30.019947
false
false
2024-10-04
2024-10-05
1
meta-llama/Meta-Llama-3-8B-Instruct
akhadangi_Llama3.2.1B.0.1-First_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/akhadangi/Llama3.2.1B.0.1-First" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">akhadangi/Llama3.2.1B.0.1-First</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/akhadangi__Llama3.2.1B.0.1-First-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
akhadangi/Llama3.2.1B.0.1-First
0b986d5ce44bb9c0f3f3fd31442353d590b37700
3.269241
llama3.2
0
1.236
true
false
false
false
0.367464
0.100093
10.009331
0.311962
4.177
0.021148
2.114804
0.244966
0
0.330125
1.432292
0.116938
1.882018
false
false
2025-03-10
2025-03-10
1
akhadangi/Llama3.2.1B.0.1-First (Merge)
akhadangi_Llama3.2.1B.0.01-Last_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/akhadangi/Llama3.2.1B.0.01-Last" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">akhadangi/Llama3.2.1B.0.01-Last</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/akhadangi__Llama3.2.1B.0.01-Last-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
akhadangi/Llama3.2.1B.0.01-Last
30ff8e568c712c27f8c4990a988115817ef604fb
3.26213
llama3.2
0
1.236
true
false
false
false
0.367426
0.09165
9.165015
0.315928
4.282945
0.013595
1.359517
0.243289
0
0.320635
2.246094
0.122673
2.519208
false
false
2025-03-10
2025-03-10
1
akhadangi/Llama3.2.1B.0.01-Last (Merge)
AI-Sweden-Models_Llama-3-8B-instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/AI-Sweden-Models/Llama-3-8B-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AI-Sweden-Models/Llama-3-8B-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AI-Sweden-Models__Llama-3-8B-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
AI-Sweden-Models/Llama-3-8B-instruct
4e1c955228bdb4d69c1c4560e8d5872312a8f033
14.34367
llama3
10
8.03
true
false
false
true
2.332222
0.240128
24.012841
0.417346
18.388096
0.03852
3.851964
0.26594
2.12528
0.477094
19.936719
0.259724
17.747119
false
false
2024-06-01
2024-06-27
2
meta-llama/Meta-Llama-3-8B
Nexesenex_Llama_3.1_8b_DodoWild_v2.02_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.1_8b_DodoWild_v2.02" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.1_8b_DodoWild_v2.02</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_DodoWild_v2.02-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.1_8b_DodoWild_v2.02
30.733559
llama3.1
2
8.03
true
false
false
true
0.686035
0.80169
80.168952
0.526174
32.319153
0.227341
22.734139
0.30453
7.270694
0.397062
11.232812
0.37608
30.675606
true
false
2025-02-28
2025-02-28
1
Nexesenex/Llama_3.1_8b_DodoWild_v2.02 (Merge)
togethercomputer_LLaMA-2-7B-32K_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/LLaMA-2-7B-32K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/LLaMA-2-7B-32K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__LLaMA-2-7B-32K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/LLaMA-2-7B-32K
46c24bb5aef59722fa7aa6d75e832afd1d64b980
6.837716
llama2
538
7
true
false
false
false
1.169146
0.186497
18.649738
0.339952
8.089984
0.01435
1.435045
0.25
0
0.375365
4.320573
0.176779
8.530954
false
true
2023-07-26
2024-06-12
0
togethercomputer/LLaMA-2-7B-32K
oopere_pruned60-llama-3.2-3b_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/oopere/pruned60-llama-3.2-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">oopere/pruned60-llama-3.2-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/oopere__pruned60-llama-3.2-3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
oopere/pruned60-llama-3.2-3b
c8c061d55288274a59205fa740b51a951ca93335
5.128681
llama3.2
0
1.944
true
false
false
false
1.241768
0.182476
18.247583
0.316626
3.988402
0.003776
0.377644
0.270134
2.684564
0.363333
4.016667
0.113115
1.457225
false
false
2024-12-12
2024-12-13
1
oopere/pruned60-llama-3.2-3b (Merge)
oopere_pruned20-llama-3.2-3b_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/oopere/pruned20-llama-3.2-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">oopere/pruned20-llama-3.2-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/oopere__pruned20-llama-3.2-3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
oopere/pruned20-llama-3.2-3b
e92642870b0ad66e589889305608f422ee9be975
5.65656
llama3.2
0
2.79
true
false
false
false
1.212079
0.178879
17.887871
0.324785
6.332745
0.015861
1.586103
0.26594
2.12528
0.341844
2.897135
0.127992
3.110225
false
false
2024-12-12
2024-12-12
1
oopere/pruned20-llama-3.2-3b (Merge)
oopere_pruned10-llama-3.2-3B_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/oopere/pruned10-llama-3.2-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">oopere/pruned10-llama-3.2-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/oopere__pruned10-llama-3.2-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
oopere/pruned10-llama-3.2-3B
5958def83347d0a8f8b95d27e7cdff37329b988c
6.919943
llama3.2
0
3.001
true
false
false
false
1.321068
0.17763
17.76298
0.334042
7.759477
0.019637
1.963746
0.266779
2.237136
0.372167
4.6875
0.163979
7.108821
false
false
2024-12-22
2024-12-22
1
oopere/pruned10-llama-3.2-3B (Merge)
Nexesenex_Llama_3.1_8b_Hermedash_R1_V1.04_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.1_8b_Hermedash_R1_V1.04" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.1_8b_Hermedash_R1_V1.04</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Hermedash_R1_V1.04-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.1_8b_Hermedash_R1_V1.04
8625849c98769300f69b6b7695e02482c7f4f0b3
30.27777
llama3.1
1
8.03
true
false
false
true
0.669668
0.787151
78.715142
0.519164
31.658972
0.186556
18.655589
0.322987
9.731544
0.411052
10.88151
0.388215
32.023862
true
false
2025-02-28
2025-02-28
1
Nexesenex/Llama_3.1_8b_Hermedash_R1_V1.04 (Merge)
Magpie-Align_Llama-3.1-8B-Magpie-Align-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Magpie-Align/Llama-3.1-8B-Magpie-Align-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Magpie-Align/Llama-3.1-8B-Magpie-Align-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Magpie-Align__Llama-3.1-8B-Magpie-Align-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Magpie-Align/Llama-3.1-8B-Magpie-Align-v0.1
dd34258a5f2bf7630b5a8e5662b050c60a088927
17.546855
llama3.1
3
8.03
true
false
false
true
1.416082
0.445784
44.578385
0.46224
24.040537
0.066465
6.646526
0.263423
1.789709
0.314062
3.091146
0.326213
25.134826
false
false
2024-07-24
2024-09-17
2
meta-llama/Meta-Llama-3.1-8B
arcee-ai_Llama-3.1-SuperNova-Lite_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/arcee-ai/Llama-3.1-SuperNova-Lite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Llama-3.1-SuperNova-Lite</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Llama-3.1-SuperNova-Lite-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
arcee-ai/Llama-3.1-SuperNova-Lite
76246ca4448c1a11787daee0958b60ab27f17774
30.193464
llama3
189
8.03
true
false
false
true
1.711987
0.801739
80.173938
0.515199
31.57234
0.182779
18.277946
0.306208
7.494407
0.416323
11.673698
0.387716
31.968454
false
false
2024-09-10
2024-09-17
2
meta-llama/Meta-Llama-3.1-8B
nvidia_Llama-3.1-Nemotron-70B-Instruct-HF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/Llama-3.1-Nemotron-70B-Instruct-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/Llama-3.1-Nemotron-70B-Instruct-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Llama-3.1-Nemotron-70B-Instruct-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/Llama-3.1-Nemotron-70B-Instruct-HF
250db5cf2323e04a6d2025a2ca2b94a95c439e88
36.907173
llama3.1
2028
70.554
true
false
false
true
27.257495
0.738067
73.806722
0.6316
47.10953
0.426737
42.673716
0.258389
1.118568
0.43276
13.195052
0.491855
43.53945
false
true
2024-10-12
2024-10-16
2
meta-llama/Meta-Llama-3.1-70B
SentientAGI_Dobby-Mini-Leashed-Llama-3.1-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/SentientAGI/Dobby-Mini-Leashed-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">SentientAGI/Dobby-Mini-Leashed-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/SentientAGI__Dobby-Mini-Leashed-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
SentientAGI/Dobby-Mini-Leashed-Llama-3.1-8B
a6f1c58d657b18f1dc507dbd8db1a79089c4a05d
29.438992
llama3.1
10
8.03
true
false
false
true
1.534644
0.784703
78.470348
0.513805
30.773045
0.185801
18.58006
0.302013
6.935123
0.425375
11.938542
0.369432
29.936835
false
false
2025-01-22
2025-02-04
2
meta-llama/Meta-Llama-3.1-8B
SentientAGI_Dobby-Mini-Unhinged-Llama-3.1-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/SentientAGI/Dobby-Mini-Unhinged-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">SentientAGI/Dobby-Mini-Unhinged-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/SentientAGI__Dobby-Mini-Unhinged-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
SentientAGI/Dobby-Mini-Unhinged-Llama-3.1-8B
23f4c4a863f933238505391f49af552e1cf2c2ad
27.457565
llama3.1
35
8.03
true
false
false
true
1.351299
0.745686
74.568589
0.514244
30.369934
0.156344
15.634441
0.306208
7.494407
0.401281
7.960156
0.358461
28.717863
false
false
2025-01-22
2025-02-04
2
meta-llama/Meta-Llama-3.1-8B
ymcki_Llama-3.1-8B-SFT-GRPO-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ymcki/Llama-3.1-8B-SFT-GRPO-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/Llama-3.1-8B-SFT-GRPO-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__Llama-3.1-8B-SFT-GRPO-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ymcki/Llama-3.1-8B-SFT-GRPO-Instruct
9d8bfc910b2be95b38a3738a938c7abf575892ac
7.659155
llama3.1
0
8.03
true
false
false
true
0.752124
0.3354
33.540007
0.312626
4.467783
0.04003
4.003021
0.253356
0.447427
0.352604
2.408854
0.109791
1.08784
false
false
2025-03-12
2025-03-12
2
meta-llama/Meta-Llama-3.1-8B
Nexesenex_Llama_3.2_1b_OpenTree_R1_0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.2_1b_OpenTree_R1_0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.2_1b_OpenTree_R1_0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_1b_OpenTree_R1_0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.2_1b_OpenTree_R1_0.1
84f5694efc7f14c835b07b961b924bf74033b841
12.343256
llama3.2
1
1.498
true
false
false
true
0.366972
0.536634
53.663391
0.327952
6.20559
0.047583
4.758308
0.252517
0.33557
0.313073
1.6
0.16747
7.496676
true
false
2025-02-22
2025-02-22
1
Nexesenex/Llama_3.2_1b_OpenTree_R1_0.1 (Merge)
Nexesenex_Llama_3.2_1b_Odyssea_V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.2_1b_Odyssea_V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.2_1b_Odyssea_V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_1b_Odyssea_V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.2_1b_Odyssea_V1
bd402805c92e2b707afa506950e4057226104021
5.724136
llama3.2
0
1.498
true
false
false
true
0.371238
0.255266
25.526603
0.300972
2.646703
0.01435
1.435045
0.258389
1.118568
0.339365
1.920573
0.115276
1.697326
true
false
2025-02-22
2025-02-22
1
Nexesenex/Llama_3.2_1b_Odyssea_V1 (Merge)
Nexesenex_Llama_3.1_8b_Typhoon_v1.03_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.1_8b_Typhoon_v1.03" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.1_8b_Typhoon_v1.03</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Typhoon_v1.03-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.1_8b_Typhoon_v1.03
67b0226762ae10e3600657831bb0f5e144057036
30.634802
llama3.1
1
8.03
true
false
false
true
0.686383
0.807834
80.783432
0.531397
33.32078
0.227341
22.734139
0.307047
7.606264
0.381469
7.783594
0.384225
31.5806
true
false
2025-02-28
2025-02-28
1
Nexesenex/Llama_3.1_8b_Typhoon_v1.03 (Merge)
Nexesenex_Llama_3.1_8b_Smarteaz_V1.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.1_8b_Smarteaz_V1.01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.1_8b_Smarteaz_V1.01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_Smarteaz_V1.01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.1_8b_Smarteaz_V1.01
195c80b1bf8431108d0f7bb87c3e84277793f437
30.623634
llama3.1
3
8.03
true
false
false
true
0.699303
0.815128
81.51283
0.524127
32.275457
0.234139
23.413897
0.309564
7.941834
0.378927
8.199219
0.373587
30.398567
true
false
2025-02-27
2025-02-27
1
Nexesenex/Llama_3.1_8b_Smarteaz_V1.01 (Merge)
HiroseKoichi_Llama-Salad-4x8B-V3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/HiroseKoichi/Llama-Salad-4x8B-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HiroseKoichi/Llama-Salad-4x8B-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HiroseKoichi__Llama-Salad-4x8B-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HiroseKoichi/Llama-Salad-4x8B-V3
a343915429779efbd1478f01ba1f7fd9d8d226c0
24.922702
llama3
6
24.942
true
true
false
true
4.27539
0.665352
66.535238
0.524465
31.928849
0.095921
9.592145
0.302852
7.04698
0.374031
6.453906
0.351812
27.979093
true
false
2024-06-17
2024-06-26
0
HiroseKoichi/Llama-Salad-4x8B-V3
Nexesenex_Llama_3.2_1b_SunOrca_V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.2_1b_SunOrca_V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.2_1b_SunOrca_V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.2_1b_SunOrca_V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.2_1b_SunOrca_V1
d4113f570bf95a718f508728fc59f6e99080d5a7
14.008724
llama3.2
0
1.498
true
false
false
true
0.3575
0.542954
54.295381
0.343064
7.852676
0.067221
6.722054
0.274329
3.243848
0.32625
2.114583
0.188414
9.823803
true
false
2025-03-06
2025-03-06
1
Nexesenex/Llama_3.2_1b_SunOrca_V1 (Merge)
oopere_pruned60-llama-1b_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/oopere/pruned60-llama-1b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">oopere/pruned60-llama-1b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/oopere__pruned60-llama-1b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
oopere/pruned60-llama-1b
86b157256928b50ee07cc3cf5b3884b70062f2fe
5.467567
llama3.2
1
0.753
true
false
false
false
0.764976
0.18285
18.285039
0.301619
2.942526
0.002266
0.226586
0.249161
0
0.408792
9.432292
0.117271
1.918957
false
false
2024-11-16
2024-11-25
1
oopere/pruned60-llama-1b (Merge)
Nexesenex_Llama_3.1_8b_DodoWild_v2.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Nexesenex/Llama_3.1_8b_DodoWild_v2.01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Nexesenex/Llama_3.1_8b_DodoWild_v2.01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Nexesenex__Llama_3.1_8b_DodoWild_v2.01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Nexesenex/Llama_3.1_8b_DodoWild_v2.01
5b6413555d32172bc6372075f0a31fbf7895723b
30.309649
llama3.1
2
8.031
true
false
false
true
0.69176
0.797768
79.77677
0.525276
32.110872
0.19864
19.864048
0.303691
7.158837
0.408969
12.521094
0.373836
30.426271
true
false
2025-02-27
2025-02-27
1
Nexesenex/Llama_3.1_8b_DodoWild_v2.01 (Merge)
theprint_CleverBoi-Llama-3.1-8B-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/CleverBoi-Llama-3.1-8B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/CleverBoi-Llama-3.1-8B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__CleverBoi-Llama-3.1-8B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/CleverBoi-Llama-3.1-8B-v2
a8b0fc584b10e0110e04f9d21c7f10d24391c1d5
14.145588
apache-2.0
0
9.3
true
false
false
false
5.042759
0.19614
19.613958
0.466782
24.132845
0.05287
5.287009
0.286074
4.809843
0.373469
6.716927
0.318816
24.312943
false
false
2024-09-15
2024-09-22
2
meta-llama/Meta-Llama-3.1-8B
theprint_ReWiz-Llama-3.1-8B-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/ReWiz-Llama-3.1-8B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-Llama-3.1-8B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-Llama-3.1-8B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/ReWiz-Llama-3.1-8B-v2
a8b0fc584b10e0110e04f9d21c7f10d24391c1d5
15.893328
apache-2.0
1
9.3
true
false
false
false
4.70618
0.237905
23.790542
0.463243
23.773287
0.057402
5.740181
0.302852
7.04698
0.381375
9.338542
0.331034
25.670434
false
false
2024-11-02
2024-11-03
2
meta-llama/Meta-Llama-3.1-8B
DeepMount00_Llama-3.1-8b-Ita_bfloat16
bfloat16
❓ other
Original
Unknown
<a target="_blank" href="https://huggingface.co/DeepMount00/Llama-3.1-8b-Ita" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DeepMount00/Llama-3.1-8b-Ita</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Llama-3.1-8b-Ita-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DeepMount00/Llama-3.1-8b-Ita
5ede1e388b6b15bc06acd364a8f805fe9ed16db9
26.265732
6
0
false
false
false
false
0.906247
0.536484
53.648431
0.517
31.333639
0.170695
17.069486
0.306208
7.494407
0.448719
15.15651
0.396027
32.891918
false
false
2024-08-13
2
meta-llama/Meta-Llama-3.1-8B
argilla-warehouse_Llama-3.1-8B-MagPie-Ultra_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/argilla-warehouse/Llama-3.1-8B-MagPie-Ultra" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla-warehouse/Llama-3.1-8B-MagPie-Ultra</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/argilla-warehouse__Llama-3.1-8B-MagPie-Ultra-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
argilla-warehouse/Llama-3.1-8B-MagPie-Ultra
1e12f20ca5db84f65a6db793a65100433aac0ac6
19.848991
llama3.1
1
8.03
true
false
false
true
1.931354
0.575651
57.565149
0.461961
23.51631
0.077039
7.703927
0.266779
2.237136
0.35425
4.247917
0.314412
23.823508
false
true
2024-09-26
2024-09-30
1
meta-llama/Llama-3.1-8B
MaziyarPanahi_calme-2.1-llama3.1-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.1-llama3.1-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.1-llama3.1-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.1-llama3.1-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.1-llama3.1-70b
f39ad1c90b0f30379e80756d29c6533cf84c362a
40.936033
4
70.554
false
false
false
true
30.909679
0.84343
84.342988
0.644755
48.553646
0.410121
41.012085
0.32802
10.402685
0.438031
13.720573
0.528258
47.58422
false
false
2024-07-23
2024-07-24
2
meta-llama/Meta-Llama-3.1-70B
Locutusque_Llama-3-NeuralHercules-5.0-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Locutusque/Llama-3-NeuralHercules-5.0-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Locutusque/Llama-3-NeuralHercules-5.0-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Locutusque__Llama-3-NeuralHercules-5.0-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Locutusque/Llama-3-NeuralHercules-5.0-8B
2bbb675e592a1772f2389fe2d58a5b610d479d94
16.042476
llama3
2
8.03
true
false
false
true
1.78143
0.448931
44.893106
0.394047
16.342072
0.043051
4.305136
0.268456
2.46085
0.388073
6.775781
0.293301
21.477911
false
false
2024-05-28
2024-06-26
0
Locutusque/Llama-3-NeuralHercules-5.0-8B