# f7a9c47619734b98bc7b8fa773b4e726
This model is a fine-tuned version of google/long-t5-tglobal-xl on the Helsinki-NLP/opus_books [fr-nl] dataset. It achieves the following results on the evaluation set:
- Loss: 1.0939
- Data Size: 1.0
- Epoch Runtime: 546.1156
- Bleu: 7.9002
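Since this is a sequence-to-sequence translation checkpoint, it can be loaded with the standard `transformers` seq2seq classes. The snippet below is a minimal inference sketch; the repository id, generation settings, and input formatting (no task prefix) are assumptions, since the card does not document how inputs were formatted during fine-tuning.

```python
# Minimal French -> Dutch inference sketch (assumed usage; input formatting is not documented).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "contemmcm/f7a9c47619734b98bc7b8fa773b4e726"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

french_text = "Je pense, donc je suis."
inputs = tokenizer(french_text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```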
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50
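The hyperparameters above map directly onto `Seq2SeqTrainingArguments`. The sketch below is illustrative only; the actual training script is not part of this card, and the output directory name is hypothetical:

```python
# Illustrative Seq2SeqTrainingArguments mirroring the hyperparameters listed above.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="long-t5-tglobal-xl-opus-books-fr-nl",  # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=8,   # 4 GPUs -> total train batch size 32
    per_device_eval_batch_size=8,    # 4 GPUs -> total eval batch size 32
    seed=42,
    optim="adamw_torch",             # AdamW with betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="constant",
    num_train_epochs=50,
    predict_with_generate=True,      # generate translations at eval time for BLEU
)
```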
### Training results
| Training Loss | Epoch | Step | Validation Loss | Data Size | Epoch Runtime | Bleu |
|---|---|---|---|---|---|---|
| No log | 0 | 0 | 3.4278 | 0 | 40.9774 | 0.2095 |
| No log | 1 | 1000 | 2.5764 | 0.0078 | 45.0469 | 0.9001 |
| No log | 2 | 2000 | 2.3949 | 0.0156 | 53.7006 | 1.0212 |
| No log | 3 | 3000 | 2.2016 | 0.0312 | 65.0629 | 1.3964 |
| 0.0964 | 4 | 4000 | 2.0627 | 0.0625 | 81.2244 | 1.8657 |
| 2.3311 | 5 | 5000 | 1.9021 | 0.125 | 109.4563 | 2.4578 |
| 0.1257 | 6 | 6000 | 1.7099 | 0.25 | 172.1917 | 3.2116 |
| 0.1584 | 7 | 7000 | 1.5225 | 0.5 | 293.2010 | 4.1032 |
| 1.4893 | 8 | 8000 | 1.3268 | 1.0 | 538.2769 | 5.2577 |
| 1.311 | 9 | 9000 | 1.2197 | 1.0 | 543.0087 | 6.0068 |
| 1.1839 | 10 | 10000 | 1.1529 | 1.0 | 535.1504 | 6.5436 |
| 1.0855 | 11 | 11000 | 1.1144 | 1.0 | 544.5768 | 6.8595 |
| 0.9977 | 12 | 12000 | 1.0864 | 1.0 | 539.8519 | 7.1574 |
| 0.9255 | 13 | 13000 | 1.0694 | 1.0 | 540.7354 | 7.4193 |
| 0.8519 | 14 | 14000 | 1.0626 | 1.0 | 537.0915 | 7.5882 |
| 0.7797 | 15 | 15000 | 1.0628 | 1.0 | 548.4139 | 7.7185 |
| 0.7195 | 16 | 16000 | 1.0635 | 1.0 | 538.1533 | 7.7641 |
| 0.6479 | 17 | 17000 | 1.0742 | 1.0 | 543.9543 | 7.8529 |
| 0.6135 | 18 | 18000 | 1.0939 | 1.0 | 546.1156 | 7.9002 |
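The BLEU column can be reproduced offline; a small sketch using the `evaluate` library is shown below. Whether sacreBLEU or another BLEU variant was used during training is not documented, so treat this as an assumption:

```python
# Assumed BLEU computation via the `evaluate` library (sacreBLEU variant assumed).
import evaluate

bleu = evaluate.load("sacrebleu")
predictions = ["Ik denk, dus ik besta."]   # model outputs (example strings)
references = [["Ik denk, dus ik ben."]]    # one list of references per prediction
result = bleu.compute(predictions=predictions, references=references)
print(result["score"])
```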
### Framework versions
- Transformers 4.57.0
- Pytorch 2.8.0+cu128
- Datasets 4.2.0
- Tokenizers 0.22.1