# mms_eng_yor
This model is a fine-tuned version of [facebook/mms-1b-all](https://huggingface.co/facebook/mms-1b-all) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.6192
- Wer: 0.5316
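The Wer figure above is the word error rate: the word-level Levenshtein distance between reference and hypothesis, divided by the number of reference words. A minimal pure-Python sketch of the metric (libraries such as `evaluate` or `jiwer` compute the same quantity):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = minimum edits to turn ref[:i] into hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deletions
    for j in range(len(hyp) + 1):
        d[0][j] = j  # insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / max(len(ref), 1)
```

So a Wer of 0.5316 means roughly 53 word errors (substitutions, insertions, deletions) per 100 reference words.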
## Model description
More information needed
## Intended uses & limitations
More information needed
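For transcription, the checkpoint can be loaded like other MMS fine-tunes via the standard Transformers CTC classes. A minimal sketch (the `transcribe` helper is illustrative, not part of the released code; audio is assumed to be 16 kHz mono):

```python
import torch
from transformers import AutoProcessor, Wav2Vec2ForCTC


def transcribe(audio_array, sampling_rate=16000,
               model_id="oluwagbotty/mms_eng_yor"):
    """Greedy CTC decoding of a 16 kHz mono waveform with this checkpoint."""
    processor = AutoProcessor.from_pretrained(model_id)
    model = Wav2Vec2ForCTC.from_pretrained(model_id)
    inputs = processor(audio_array, sampling_rate=sampling_rate,
                       return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(ids)[0]
```

Beam search with a language model (e.g. via `pyctcdecode`) would typically lower the WER further than the greedy argmax used here.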
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (PyTorch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
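Two of these settings interact: the total train batch size of 32 is the per-device batch size (16) times the gradient accumulation steps (2), and the linear scheduler decays the learning rate from 5e-05 toward zero over training. A small sketch of both (the `linear_lr` helper and the 4100-step total are illustrative; no warmup is assumed):

```python
TRAIN_BATCH_SIZE = 16
GRAD_ACCUM_STEPS = 2
# Effective batch size seen by each optimizer step
TOTAL_TRAIN_BATCH_SIZE = TRAIN_BATCH_SIZE * GRAD_ACCUM_STEPS  # 32

BASE_LR = 5e-5
TOTAL_STEPS = 4100  # final step in the results table


def linear_lr(step: int, total_steps: int = TOTAL_STEPS,
              base_lr: float = BASE_LR) -> float:
    """Linear decay from base_lr at step 0 to 0 at total_steps."""
    frac = min(step, total_steps) / total_steps
    return base_lr * (1.0 - frac)
```

Halfway through training (step 2050) the learning rate would therefore be about 2.5e-05.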
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|---|---|---|---|---|
| 10.7343 | 0.2436 | 100 | 4.3854 | 1.0 |
| 3.1325 | 0.4872 | 200 | 2.1582 | 0.9691 |
| 1.6166 | 0.7308 | 300 | 1.1347 | 0.7188 |
| 1.1808 | 0.9744 | 400 | 0.9792 | 0.6747 |
| 1.0459 | 1.2168 | 500 | 0.9050 | 0.6494 |
| 1.0317 | 1.4604 | 600 | 0.8543 | 0.6338 |
| 0.9836 | 1.7040 | 700 | 0.8191 | 0.6252 |
| 0.9567 | 1.9476 | 800 | 0.7955 | 0.6124 |
| 0.9354 | 2.1900 | 900 | 0.7705 | 0.6046 |
| 0.9037 | 2.4336 | 1000 | 0.7526 | 0.5982 |
| 0.901 | 2.6772 | 1100 | 0.7370 | 0.5960 |
| 0.8888 | 2.9208 | 1200 | 0.7251 | 0.5878 |
| 0.8686 | 3.1632 | 1300 | 0.7125 | 0.5834 |
| 0.8681 | 3.4068 | 1400 | 0.7030 | 0.5770 |
| 0.8428 | 3.6504 | 1500 | 0.6939 | 0.5729 |
| 0.8372 | 3.8940 | 1600 | 0.6849 | 0.5707 |
| 0.8388 | 4.1364 | 1700 | 0.6779 | 0.5667 |
| 0.8222 | 4.3800 | 1800 | 0.6727 | 0.5621 |
| 0.8289 | 4.6236 | 1900 | 0.6665 | 0.5570 |
| 0.8189 | 4.8672 | 2000 | 0.6623 | 0.5564 |
| 0.8073 | 5.1096 | 2100 | 0.6582 | 0.5535 |
| 0.8 | 5.3532 | 2200 | 0.6532 | 0.5505 |
| 0.8051 | 5.5968 | 2300 | 0.6487 | 0.5461 |
| 0.7897 | 5.8404 | 2400 | 0.6454 | 0.5444 |
| 0.7723 | 6.0828 | 2500 | 0.6421 | 0.5446 |
| 0.7805 | 6.3264 | 2600 | 0.6393 | 0.5413 |
| 0.7974 | 6.5700 | 2700 | 0.6365 | 0.5396 |
| 0.7794 | 6.8136 | 2800 | 0.6344 | 0.5392 |
| 0.7676 | 7.0560 | 2900 | 0.6326 | 0.5389 |
| 0.7627 | 7.2996 | 3000 | 0.6305 | 0.5393 |
| 0.7881 | 7.5432 | 3100 | 0.6282 | 0.5379 |
| 0.7689 | 7.7868 | 3200 | 0.6267 | 0.5342 |
| 0.7784 | 8.0292 | 3300 | 0.6253 | 0.5370 |
| 0.7643 | 8.2728 | 3400 | 0.6245 | 0.5345 |
| 0.7817 | 8.5164 | 3500 | 0.6230 | 0.5351 |
| 0.7508 | 8.7600 | 3600 | 0.6218 | 0.5342 |
| 0.7772 | 9.0049 | 3700 | 0.6209 | 0.5334 |
| 0.7624 | 9.2485 | 3800 | 0.6201 | 0.5328 |
| 0.7694 | 9.4921 | 3900 | 0.6196 | 0.5313 |
| 0.7593 | 9.7357 | 4000 | 0.6194 | 0.5308 |
| 0.7585 | 9.9793 | 4100 | 0.6192 | 0.5316 |
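A back-of-envelope consistency check on the log (illustrative arithmetic, not stated in the card): each logged row spans 100 optimizer steps and about 0.2436 epochs, which pins down the epoch length and, via the effective batch size, the approximate dataset size.

```python
# 100 optimizer steps cover ~0.2436 epochs, so one epoch is
# roughly 410 optimizer steps.
steps_per_epoch = 100 / 0.2436
# With a total train batch size of 32 per optimizer step, that
# implies roughly 13,000 training samples per epoch.
samples_per_epoch = steps_per_epoch * 32
```

This is consistent with training ending near step 4100 after 10 epochs.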
### Framework versions
- Transformers 4.52.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1