roberta_sentiment_detection
This model is a fine-tuned version of FacebookAI/roberta-base on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0687
- Exact Match Accuracy: 0.3028
- F1 Micro: 0.4447
- F1 Macro: 0.3193
- Roc Auc Macro: 0.8205
- Avg Precision Macro: 0.5004
- F1 Class 0: 0.1963
- F1 Class 1: 0.2745
- F1 Class 2: 0.5010
- F1 Class 3: 0.0526
- F1 Class 4: 0.6405
- F1 Class 5: 0.5008
- F1 Class 6: 0.0697
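The seven per-class F1 scores above indicate a 7-label multi-label setup. The snippet below is a minimal inference sketch, not code from this repository: it assumes sigmoid activations with a 0.5 decision threshold, and the mapping from class indices to sentiment labels is not documented in this card.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "lindtsey/roberta_sentiment_detection"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer(
    "I really enjoyed this, but the ending felt rushed.",
    return_tensors="pt",
    truncation=True,
)
with torch.no_grad():
    logits = model(**inputs).logits          # shape: (1, num_labels)

probs = torch.sigmoid(logits)[0]             # per-class probabilities (multi-label assumption)
predicted = (probs >= 0.5).nonzero().flatten().tolist()  # 0.5 threshold is an assumption
print({f"class_{i}": round(p.item(), 3) for i, p in enumerate(probs)})
print("predicted classes:", predicted)
```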
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 192
- eval_batch_size: 384
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
- mixed_precision_training: Native AMP
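For reference, the hyperparameters above map roughly onto the following `TrainingArguments`. This is a reconstruction rather than the original training script; in particular, whether the listed batch sizes are per-device or total is an assumption, and dataset loading, tokenization, and the metric function are omitted.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta_sentiment_detection",
    learning_rate=3e-5,
    per_device_train_batch_size=192,   # card lists train_batch_size: 192
    per_device_eval_batch_size=384,    # card lists eval_batch_size: 384
    seed=42,
    optim="adamw_torch",               # AdamW, betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10,
    fp16=True,                         # native AMP mixed precision
    eval_strategy="epoch",
    logging_strategy="epoch",
)

# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_ds, eval_dataset=eval_ds,
#                   compute_metrics=compute_metrics)
# trainer.train()
```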
Training results
| Training Loss | Epoch | Step | Validation Loss | Exact Match Accuracy | F1 Micro | F1 Macro | Roc Auc Macro | Avg Precision Macro | F1 Class 0 | F1 Class 1 | F1 Class 2 | F1 Class 3 | F1 Class 4 | F1 Class 5 | F1 Class 6 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 181 | 0.0608 | 0.0089 | 0.0171 | 0.0547 | 0.7791 | 0.4369 | 0.0 | 0.0 | 0.3830 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1293 | 2.0 | 362 | 0.0560 | 0.0263 | 0.0473 | 0.0933 | 0.8291 | 0.5394 | 0.0 | 0.0 | 0.4752 | 0.0 | 0.0183 | 0.1597 | 0.0 |
| 0.1293 | 3.0 | 543 | 0.0548 | 0.1775 | 0.2832 | 0.1947 | 0.8390 | 0.5507 | 0.0 | 0.0925 | 0.4988 | 0.0 | 0.4706 | 0.3013 | 0.0 |
| 0.0522 | 4.0 | 724 | 0.0559 | 0.1357 | 0.2243 | 0.1949 | 0.8361 | 0.5424 | 0.0 | 0.1292 | 0.5237 | 0.0 | 0.3202 | 0.3911 | 0.0 |
| 0.0522 | 5.0 | 905 | 0.0569 | 0.2144 | 0.3369 | 0.2268 | 0.8381 | 0.5407 | 0.0183 | 0.1503 | 0.5195 | 0.0 | 0.5441 | 0.3346 | 0.0209 |
| 0.0475 | 6.0 | 1086 | 0.0602 | 0.2424 | 0.3755 | 0.2568 | 0.8318 | 0.5309 | 0.0322 | 0.1852 | 0.5233 | 0.0026 | 0.5792 | 0.4505 | 0.0244 |
| 0.0475 | 7.0 | 1267 | 0.0643 | 0.2719 | 0.4104 | 0.2893 | 0.8292 | 0.5172 | 0.1155 | 0.2197 | 0.5055 | 0.0407 | 0.6027 | 0.5169 | 0.0243 |
| 0.0439 | 8.0 | 1448 | 0.0647 | 0.2873 | 0.4290 | 0.2979 | 0.8244 | 0.5123 | 0.1277 | 0.2786 | 0.5210 | 0.0156 | 0.6350 | 0.4730 | 0.0345 |
| 0.0439 | 9.0 | 1629 | 0.0672 | 0.2901 | 0.4315 | 0.3143 | 0.8228 | 0.5068 | 0.1593 | 0.2654 | 0.5155 | 0.0574 | 0.6244 | 0.4964 | 0.0816 |
| 0.0414 | 10.0 | 1810 | 0.0687 | 0.3028 | 0.4447 | 0.3193 | 0.8205 | 0.5004 | 0.1963 | 0.2745 | 0.5010 | 0.0526 | 0.6405 | 0.5008 | 0.0697 |
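The metrics reported in the table can be computed for a 7-class multi-label problem along the following lines. `y_true` and `y_prob` below are dummy placeholders standing in for the evaluation labels and model probabilities, and the 0.5 decision threshold is an assumption.

```python
import numpy as np
from sklearn.metrics import (accuracy_score, f1_score, roc_auc_score,
                             average_precision_score)

y_true = np.random.randint(0, 2, size=(64, 7))   # multi-hot ground-truth labels (dummy)
y_prob = np.random.rand(64, 7)                   # sigmoid outputs from the model (dummy)
y_pred = (y_prob >= 0.5).astype(int)             # assumed 0.5 threshold

metrics = {
    "exact_match_accuracy": accuracy_score(y_true, y_pred),   # all 7 labels must match
    "f1_micro": f1_score(y_true, y_pred, average="micro"),
    "f1_macro": f1_score(y_true, y_pred, average="macro"),
    "roc_auc_macro": roc_auc_score(y_true, y_prob, average="macro"),
    "avg_precision_macro": average_precision_score(y_true, y_prob, average="macro"),
}
metrics.update({f"f1_class_{i}": f
                for i, f in enumerate(f1_score(y_true, y_pred, average=None))})
print(metrics)
```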
Framework versions
- Transformers 4.53.3
- Pytorch 2.6.0+cu124
- Datasets 4.1.1
- Tokenizers 0.21.2