minhtien2405 committed c78a5f3 (verified) · 1 parent: 539601b

Model save
README.md ADDED
---
library_name: transformers
license: cc-by-nc-4.0
base_model: nguyenvulebinh/wav2vec2-base-vietnamese-250h
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-base-north-vi
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# wav2vec2-base-north-vi

This model is a fine-tuned version of [nguyenvulebinh/wav2vec2-base-vietnamese-250h](https://huggingface.co/nguyenvulebinh/wav2vec2-base-vietnamese-250h) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3268
- Wer: 0.1288
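The Wer figure is the word error rate: the minimum number of word-level substitutions, deletions, and insertions needed to turn the hypothesis into the reference, divided by the number of reference words. A minimal pure-Python sketch of that computation (the Trainer itself typically relies on the `evaluate`/`jiwer` implementations; the example sentences are hypothetical):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("toi di hoc", "toi di lam"))  # one substitution out of three words
```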
## Model description

More information needed

## Intended uses & limitations

More information needed
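Like other wav2vec2 checkpoints, this is a CTC acoustic model: at inference it emits per-frame logits over a character vocabulary, and a transcript is recovered by greedy CTC decoding (argmax per frame, collapse consecutive repeats, drop the blank token). A minimal sketch of that decoding step, using a toy vocabulary and hypothetical frame ids rather than this model's actual vocabulary:

```python
def ctc_greedy_decode(frame_ids, id_to_char, blank_id=0):
    """Collapse repeated frame predictions, then drop CTC blanks."""
    out = []
    prev = None
    for i in frame_ids:
        if i != prev:          # collapse consecutive repeats
            if i != blank_id:  # drop the blank token
                out.append(id_to_char[i])
        prev = i
    return "".join(out)

# Hypothetical vocabulary and per-frame argmax ids.
vocab = {0: "<pad>", 1: "b", 2: "a", 3: "|"}  # "|" marks a word boundary
frames = [1, 1, 0, 2, 2, 2, 0, 0, 1, 2]      # b b _ a a a _ _ b a
print(ctc_greedy_decode(frames, vocab))      # -> "baba"
```

In practice the per-frame ids come from `argmax` over the model's logits, and the processor's `batch_decode` performs this collapsing for you.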
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 20
- num_epochs: 30
- mixed_precision_training: Native AMP
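The list above maps directly onto `transformers.TrainingArguments`; a hedged sketch of the corresponding configuration (the `output_dir` is a placeholder, not taken from the actual run):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameter list above; output_dir is a placeholder.
args = TrainingArguments(
    output_dir="wav2vec2-base-north-vi",
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=8,  # 4 * 8 = effective train batch size of 32
    optim="adamw_torch",            # AdamW defaults: betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_steps=20,
    num_train_epochs=30,
    fp16=True,                      # "Native AMP" mixed precision
)
```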

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-------:|:----:|:---------------:|:------:|
| 1.2171 | 0.2164 | 40 | 0.3531 | 0.2336 |
| 0.527 | 0.4327 | 80 | 0.3674 | 0.1767 |
| 0.5245 | 0.6491 | 120 | 0.3467 | 0.1967 |
| 0.5189 | 0.8654 | 160 | 0.3635 | 0.1812 |
| 0.4589 | 1.0811 | 200 | 0.3410 | 0.1807 |
| 0.442 | 1.2975 | 240 | 0.3382 | 0.1764 |
| 0.4528 | 1.5139 | 280 | 0.3404 | 0.1713 |
| 1.1344 | 1.7302 | 320 | 0.3403 | 0.1843 |
| 0.4726 | 1.9466 | 360 | 0.3365 | 0.1762 |
| 0.475 | 2.1623 | 400 | 0.3442 | 0.1729 |
| 0.4345 | 2.3786 | 440 | 0.3317 | 0.1706 |
| 0.4249 | 2.5950 | 480 | 0.3149 | 0.1769 |
| 0.4385 | 2.8114 | 520 | 0.3281 | 0.1646 |
| 1.119 | 3.0270 | 560 | 0.3422 | 0.1613 |
| 0.4082 | 3.2434 | 600 | 0.3449 | 0.1680 |
| 1.0262 | 3.4598 | 640 | 0.3459 | 0.1630 |
| 0.411 | 3.6761 | 680 | 0.3157 | 0.1744 |
| 0.3922 | 3.8925 | 720 | 0.3347 | 0.1672 |
| 0.408 | 4.1082 | 760 | 0.3260 | 0.1619 |
| 0.3922 | 4.3245 | 800 | 0.3212 | 0.1718 |
| 0.4002 | 4.5409 | 840 | 0.3212 | 0.2031 |
| 0.399 | 4.7573 | 880 | 0.3207 | 0.1677 |
| 0.418 | 4.9736 | 920 | 0.3392 | 0.1605 |
| 0.379 | 5.1893 | 960 | 0.3145 | 0.1718 |
| 0.3729 | 5.4057 | 1000 | 0.3234 | 0.1665 |
| 0.3675 | 5.6220 | 1040 | 0.3262 | 0.1663 |
| 0.3941 | 5.8384 | 1080 | 0.3375 | 0.1580 |
| 0.3763 | 6.0541 | 1120 | 0.3199 | 0.1701 |
| 0.3567 | 6.2705 | 1160 | 0.3267 | 0.1651 |
| 0.3521 | 6.4868 | 1200 | 0.3184 | 0.1572 |
| 0.3464 | 6.7032 | 1240 | 0.3357 | 0.1621 |
| 0.3413 | 6.9195 | 1280 | 0.3094 | 0.1590 |
| 0.3563 | 7.1352 | 1320 | 0.3343 | 0.1600 |
| 0.356 | 7.3516 | 1360 | 0.3285 | 0.1561 |
| 0.3599 | 7.5680 | 1400 | 0.3299 | 0.1573 |
| 0.3497 | 7.7843 | 1440 | 0.3299 | 0.1540 |
| 0.8469 | 8.0 | 1480 | 0.3199 | 0.1549 |
| 0.3352 | 8.2164 | 1520 | 0.3173 | 0.1670 |
| 0.3362 | 8.4327 | 1560 | 0.3256 | 0.1550 |
| 0.3468 | 8.6491 | 1600 | 0.3241 | 0.1615 |
| 0.3352 | 8.8654 | 1640 | 0.3219 | 0.1580 |
| 0.3588 | 9.0811 | 1680 | 0.3256 | 0.1534 |
| 0.3077 | 9.2975 | 1720 | 0.3384 | 0.1564 |
| 0.3169 | 9.5139 | 1760 | 0.3272 | 0.1495 |
| 0.3406 | 9.7302 | 1800 | 0.3249 | 0.1553 |
| 0.3341 | 9.9466 | 1840 | 0.3250 | 0.1531 |
| 0.3071 | 10.1623 | 1880 | 0.3522 | 0.1493 |
| 0.2924 | 10.3786 | 1920 | 0.3201 | 0.1553 |
| 0.3378 | 10.5950 | 1960 | 0.3238 | 0.1528 |
| 0.3234 | 10.8114 | 2000 | 0.3344 | 0.1555 |
| 0.3143 | 11.0270 | 2040 | 0.3269 | 0.1558 |
| 0.3023 | 11.2434 | 2080 | 0.3220 | 0.1565 |
| 0.2961 | 11.4598 | 2120 | 0.3187 | 0.1721 |
| 0.2995 | 11.6761 | 2160 | 0.3464 | 0.1527 |
| 0.3251 | 11.8925 | 2200 | 0.3225 | 0.1539 |
| 0.3395 | 12.1082 | 2240 | 0.3360 | 0.1539 |
| 0.3158 | 12.3245 | 2280 | 0.3173 | 0.1496 |
| 0.311 | 12.5409 | 2320 | 0.3302 | 0.1473 |
| 0.284 | 12.7573 | 2360 | 0.3399 | 0.1500 |
| 0.3092 | 12.9736 | 2400 | 0.3245 | 0.1509 |
| 0.3245 | 13.1893 | 2440 | 0.3281 | 0.1503 |
| 0.3889 | 13.4057 | 2480 | 0.3419 | 0.1495 |
| 0.2609 | 13.6220 | 2520 | 0.3403 | 0.1480 |
| 0.2769 | 13.8384 | 2560 | 0.3264 | 0.1483 |
| 0.2643 | 14.0541 | 2600 | 0.3315 | 0.1574 |
| 0.2804 | 14.2705 | 2640 | 0.3357 | 0.1489 |
| 0.2668 | 14.4868 | 2680 | 0.3186 | 0.1456 |
| 0.2739 | 14.7032 | 2720 | 0.3407 | 0.1492 |
| 0.263 | 14.9195 | 2760 | 0.3306 | 0.1491 |
| 0.2582 | 15.1352 | 2800 | 0.3307 | 0.1473 |
| 0.2787 | 15.3516 | 2840 | 0.3310 | 0.1520 |
| 0.276 | 15.5680 | 2880 | 0.3270 | 0.1486 |
| 0.2758 | 15.7843 | 2920 | 0.3370 | 0.1471 |
| 0.292 | 16.0 | 2960 | 0.3456 | 0.1451 |
| 0.2643 | 16.2164 | 3000 | 0.3384 | 0.1499 |
| 0.2707 | 16.4327 | 3040 | 0.3460 | 0.1444 |
| 0.2606 | 16.6491 | 3080 | 0.3355 | 0.1462 |
| 0.2554 | 16.8654 | 3120 | 0.3534 | 0.1441 |
| 0.2484 | 17.0811 | 3160 | 0.3466 | 0.1517 |
| 0.231 | 17.2975 | 3200 | 0.3353 | 0.1454 |
| 0.2502 | 17.5139 | 3240 | 0.3406 | 0.1464 |
| 0.2574 | 17.7302 | 3280 | 0.3347 | 0.1451 |
| 0.2339 | 17.9466 | 3320 | 0.3430 | 0.1490 |
| 0.2305 | 18.1623 | 3360 | 0.3472 | 0.1476 |
| 0.2415 | 18.3786 | 3400 | 0.3393 | 0.1455 |
| 0.2579 | 18.5950 | 3440 | 0.3396 | 0.1466 |
| 0.254 | 18.8114 | 3480 | 0.3443 | 0.1436 |
| 0.2292 | 19.0270 | 3520 | 0.3503 | 0.1454 |
| 0.2358 | 19.2434 | 3560 | 0.3547 | 0.1447 |
| 0.231 | 19.4598 | 3600 | 0.3545 | 0.1436 |
| 0.2542 | 19.6761 | 3640 | 0.3432 | 0.1426 |
| 0.2466 | 19.8925 | 3680 | 0.3539 | 0.1403 |
| 0.2367 | 20.1082 | 3720 | 0.3458 | 0.1453 |
| 0.2196 | 20.3245 | 3760 | 0.3460 | 0.1412 |
| 0.2126 | 20.5409 | 3800 | 0.3539 | 0.1466 |
| 0.2254 | 20.7573 | 3840 | 0.3561 | 0.1400 |
| 0.2301 | 20.9736 | 3880 | 0.3446 | 0.1428 |
| 0.2157 | 21.1893 | 3920 | 0.3542 | 0.1432 |
| 0.2157 | 21.4057 | 3960 | 0.3557 | 0.1400 |
| 0.2172 | 21.6220 | 4000 | 0.3438 | 0.1408 |
| 0.1969 | 21.8384 | 4040 | 0.3538 | 0.1451 |
| 0.2001 | 22.0541 | 4080 | 0.3578 | 0.1415 |
| 0.23 | 22.2705 | 4120 | 0.3501 | 0.1414 |
| 0.2285 | 22.4868 | 4160 | 0.3622 | 0.1403 |
| 0.2049 | 22.7032 | 4200 | 0.3649 | 0.1397 |
| 0.2228 | 22.9195 | 4240 | 0.3602 | 0.1391 |
| 0.2393 | 23.1352 | 4280 | 0.3624 | 0.1386 |
| 0.2116 | 23.3516 | 4320 | 0.3548 | 0.1374 |
| 0.256 | 23.5680 | 4360 | 0.3536 | 0.1399 |
| 0.2157 | 23.7843 | 4400 | 0.3670 | 0.1380 |
| 0.2155 | 24.0 | 4440 | 0.3596 | 0.1399 |
| 0.1938 | 24.2164 | 4480 | 0.3637 | 0.1407 |
| 0.1972 | 24.4327 | 4520 | 0.3733 | 0.1372 |
| 0.2142 | 24.6491 | 4560 | 0.3579 | 0.1399 |
| 0.2092 | 24.8654 | 4600 | 0.3647 | 0.1361 |
| 0.3059 | 25.0811 | 4640 | 0.3707 | 0.1387 |
| 0.2014 | 25.2975 | 4680 | 0.3723 | 0.1352 |
| 0.2116 | 25.5139 | 4720 | 0.3629 | 0.1374 |
| 0.1854 | 25.7302 | 4760 | 0.3624 | 0.1371 |
| 0.2074 | 25.9466 | 4800 | 0.3873 | 0.1345 |
| 0.2034 | 26.1623 | 4840 | 0.3603 | 0.1376 |
| 0.1893 | 26.3786 | 4880 | 0.3761 | 0.1369 |
| 0.1859 | 26.5950 | 4920 | 0.3737 | 0.1354 |
| 0.2076 | 26.8114 | 4960 | 0.3528 | 0.1372 |
| 0.1879 | 27.0270 | 5000 | 0.3657 | 0.1356 |
| 0.1927 | 27.2434 | 5040 | 0.3637 | 0.1351 |
| 0.2059 | 27.4598 | 5080 | 0.3789 | 0.1341 |
| 0.1751 | 27.6761 | 5120 | 0.3671 | 0.1355 |
| 0.1864 | 27.8925 | 5160 | 0.3657 | 0.1348 |
| 0.1822 | 28.1082 | 5200 | 0.3653 | 0.1358 |
| 0.1955 | 28.3245 | 5240 | 0.3719 | 0.1356 |
| 0.194 | 28.5409 | 5280 | 0.3706 | 0.1360 |
| 0.1888 | 28.7573 | 5320 | 0.3700 | 0.1358 |
| 0.1954 | 28.9736 | 5360 | 0.3664 | 0.1347 |
| 0.1897 | 29.1893 | 5400 | 0.3687 | 0.1350 |
| 0.1851 | 29.4057 | 5440 | 0.3664 | 0.1356 |
| 0.182 | 29.6220 | 5480 | 0.3674 | 0.1354 |
| 0.187 | 29.8384 | 5520 | 0.3651 | 0.1348 |


### Framework versions

- Transformers 4.53.0
- Pytorch 2.7.1+cu126
- Datasets 3.6.0
- Tokenizers 0.21.2
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:fc3d9e1a29b7288d4c2a4e0287db18ed99f75f4be3c7a84c6975269a3e737209
+oid sha256:e04d88f143ac6b6388da82465ed5d98dc7e30844ca104d1695e0c6738cfc6ef3
 size 377851056
wav2vec_base_north_vi_250h_wer_history.json ADDED

[[40, 0.23355774210253755], [80, 0.17665219296498427], [120, 0.196709556626698], [160, 0.1811536469744652], [200, 0.18073537027446918], [240, 0.17643309564593873], [280, 0.17125443174122615], [320, 0.18430068119348286], [360, 0.17615424451260805], [400, 0.17290762060311515], [440, 0.1706170577221846], [480, 0.1769111261602199], [520, 0.16464167629367008], [560, 0.1612755447556069], [600, 0.16800780783173325], [640, 0.1630482412460662], [680, 0.1744014659602438], [720, 0.16721109030793133], [760, 0.1618531649603633], [800, 0.1717722981316974], [840, 0.20306337887901843], [880, 0.16774887463649762], [920, 0.16047882723180495], [960, 0.1718121340078875], [1000, 0.16647412659841454], [1040, 0.16627494721746405], [1080, 0.158028920846114], [1120, 0.17013902720790344], [1160, 0.16507987093176116], [1200, 0.1571525315699319], [1240, 0.16207226227940882], [1280, 0.15896506393658127], [1320, 0.16002071465561885], [1360, 0.15609688085089432], [1400, 0.15731187507469227], [1440, 0.15398557941281918], [1480, 0.15488188662709637], [1520, 0.16701191092698084], [1560, 0.1550213121937617], [1600, 0.16153447795084253], [1640, 0.15798908496992392], [1680, 0.15338804126996775], [1720, 0.15639564992232005], [1760, 0.1494642074652432], [1800, 0.1552603274509023], [1840, 0.15306935426044696], [1880, 0.14928494602238776], [1920, 0.15533999920328248], [1960, 0.15281042106521134], [2000, 0.15549934270804286], [2040, 0.1558379476556587], [2080, 0.15653507548898538], [2120, 0.1721308210174083], [2160, 0.15273074931283115], [2200, 0.15388598972234394], [2240, 0.15388598972234394], [2280, 0.14956379715571844], [2320, 0.14729315221288292], [2360, 0.14998207385571447], [2400, 0.15091821694618174], [2440, 0.15030076086523522], [2480, 0.14948412540333825], [2520, 0.14801019798430468], [2560, 0.14828904911763535], [2600, 0.1574114647651675], [2640, 0.1489264231366769], [2680, 0.1456399633509939], [2720, 0.14918535633191252], [2760, 0.14914552045572244], [2800, 0.14731307015097797], [2840, 0.1519738676652193], [2880, 0.14862765406525116], [2920, 0.14713380870812254], [2960, 0.14512209696052264], [3000, 0.14994223797952436], [3040, 0.1444050511891009], [3080, 0.14619766561765526], [3120, 0.14412620005577023], [3160, 0.15169501653188863], [3200, 0.14542086603194837], [3240, 0.1463769270605107], [3280, 0.14510217902242759], [3320, 0.1490459307652472], [3360, 0.1475720033462136], [3400, 0.14546070190813848], [3440, 0.14663586025574632], [3480, 0.14356849778910888], [3520, 0.14536111221766324], [3560, 0.14470382026052664], [3600, 0.14358841572720393], [3640, 0.14261243676054655], [3680, 0.14034179181771103], [3720, 0.14532127634147313], [3760, 0.1412380990319882], [3800, 0.14663586025574632], [3840, 0.14000318687009522], [3880, 0.14277178026530693], [3920, 0.143209974903398], [3960, 0.14002310480819025], [4000, 0.14079990439389714], [4040, 0.14510217902242759], [4080, 0.14153686810341393], [4120, 0.14143727841293868], [4160, 0.14034179181771103], [4200, 0.13968449986057443], [4240, 0.13906704377962792], [4280, 0.1386089312034418], [4320, 0.13739393697964386], [4360, 0.13990359717961998], [4400, 0.13799147512249532], [4440, 0.13986376130342987], [4480, 0.140740150579612], [4520, 0.1371549217225033], [4560, 0.13986376130342987], [4600, 0.13613910687965583], [4640, 0.13874835677010716], [4680, 0.1351830458510935], [4720, 0.13735410110345378], [4760, 0.13713500378440824], [4800, 0.13454567183205193], [4840, 0.13763295223678446], [4880, 0.13693582440345775], [4920, 0.13540214317013902], [4960, 0.13719475759869337], [5000, 0.13562124048918456], [5040, 0.1350635382225232], [5080, 0.13410747719396088], [5120, 0.13554156873680437], [5160, 0.13476476915109747], [5200, 0.13580050193204], [5240, 0.13558140461299445], [5280, 0.13597976337489542], [5320, 0.1357606660558499], [5360, 0.13468509739871729], [5400, 0.13504362028442815], [5440, 0.1356013225510895], [5480, 0.13540214317013902], [5520, 0.13484444090347766], [5550, 0.1288309286691195]]
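The history file is a plain JSON list of `[step, WER]` pairs, so locating the best checkpoint is a one-liner. A sketch using a short inline excerpt of the file's contents (its final entry, step 5550, matches the reported Wer of 0.1288):

```python
import json

# Short excerpt of the [step, WER] pairs stored in
# wav2vec_base_north_vi_250h_wer_history.json.
history_json = (
    "[[40, 0.23355774210253755], [5520, 0.13484444090347766], "
    "[5550, 0.1288309286691195]]"
)
history = json.loads(history_json)

# The pair with the smallest WER identifies the best evaluation step.
best_step, best_wer = min(history, key=lambda pair: pair[1])
print(best_step, round(best_wer, 4))  # -> 5550 0.1288
```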