---
library_name: transformers
license: mit
base_model: flax-community/indonesian-roberta-base
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: indonesian-roberta-hate-speech-detection
  results: []
---

# indonesian-roberta-hate-speech-detection

This model is a fine-tuned version of [flax-community/indonesian-roberta-base](https://huggingface.co/flax-community/indonesian-roberta-base) for Indonesian hate speech detection. The fine-tuning dataset is not recorded in the training metadata.
It achieves the following results on the evaluation set:
- Loss: 0.4118
- Accuracy: 0.826
- Auc: 0.903
- Precision: 0.8257
- Recall: 0.8257
- F1: 0.8256

## Model description

A RoBERTa-base encoder pretrained on Indonesian text ([flax-community/indonesian-roberta-base](https://huggingface.co/flax-community/indonesian-roberta-base)) with a sequence-classification head, fine-tuned to label Indonesian text as hate speech or not. The single AUC value reported per epoch suggests a binary label set, but the labels themselves are not documented.

## Intended uses & limitations

Intended for flagging hate speech in Indonesian-language text, for example as a first-pass filter in content moderation. Because the training data, label definitions, and annotation guidelines are undocumented, behaviour on other domains, regional dialects, slang, or code-mixed Indonesian-English text is unknown; predictions should be reviewed by a human before any moderation action is taken.
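
A minimal inference sketch using the `transformers` pipeline API; the hub repository id is a placeholder, and the label names depend on the `id2label` mapping saved with the model:

```python
from transformers import pipeline

# Placeholder repository id -- substitute the actual hub path of this model.
classifier = pipeline(
    "text-classification",
    model="<user>/indonesian-roberta-hate-speech-detection",
)

# Returns e.g. [{'label': 'LABEL_1', 'score': 0.97}]; the label names come
# from the id2label mapping stored in the model's config.
print(classifier("Contoh kalimat berbahasa Indonesia."))
```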

## Training and evaluation data

The training and evaluation datasets are not documented. From the results table below, one epoch corresponds to 188 optimisation steps at batch size 64, which implies roughly 12,000 training examples (assuming a single device and no gradient accumulation).

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (reproduced as a `TrainingArguments` sketch after the list):
- learning_rate: 1e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 20
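
For reference, a minimal `TrainingArguments` sketch matching these values; `output_dir` and the per-epoch evaluation setting are assumptions inferred from the results table, not taken from the original run:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="indonesian-roberta-hate-speech-detection",  # assumed name
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch",          # AdamW with default betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=20,
    eval_strategy="epoch",        # inferred from the per-epoch validation rows below
)
```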

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Auc   | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-----:|:---------:|:------:|:------:|
| 0.6653        | 1.0   | 188  | 0.5515          | 0.724    | 0.798 | 0.7243    | 0.7237 | 0.7238 |
| 0.5131        | 2.0   | 376  | 0.4578          | 0.783    | 0.875 | 0.7874    | 0.7830 | 0.7815 |
| 0.4307        | 3.0   | 564  | 0.4418          | 0.795    | 0.904 | 0.8137    | 0.7949 | 0.7928 |
| 0.3668        | 4.0   | 752  | 0.3804          | 0.835    | 0.916 | 0.8354    | 0.8349 | 0.8349 |
| 0.2991        | 5.0   | 940  | 0.4464          | 0.820    | 0.918 | 0.8345    | 0.8202 | 0.8190 |
| 0.2467        | 6.0   | 1128 | 0.4350          | 0.838    | 0.926 | 0.8482    | 0.8382 | 0.8375 |
| 0.2064        | 7.0   | 1316 | 0.4284          | 0.844    | 0.930 | 0.8515    | 0.8442 | 0.8438 |
| 0.1602        | 8.0   | 1504 | 0.4545          | 0.851    | 0.928 | 0.8520    | 0.8509 | 0.8509 |
| 0.1476        | 9.0   | 1692 | 0.4832          | 0.848    | 0.933 | 0.8545    | 0.8482 | 0.8479 |
| 0.1170        | 10.0  | 1880 | 0.4632          | 0.855    | 0.932 | 0.8549    | 0.8549 | 0.8549 |
| 0.0925        | 11.0  | 2068 | 0.5501          | 0.860    | 0.933 | 0.8617    | 0.8602 | 0.8602 |
| 0.0811        | 12.0  | 2256 | 0.6097          | 0.849    | 0.933 | 0.8528    | 0.8489 | 0.8487 |
| 0.0702        | 13.0  | 2444 | 0.6568          | 0.850    | 0.933 | 0.8525    | 0.8495 | 0.8494 |
| 0.0588        | 14.0  | 2632 | 0.7092          | 0.844    | 0.930 | 0.8484    | 0.8442 | 0.8440 |
| 0.0572        | 15.0  | 2820 | 0.7012          | 0.855    | 0.933 | 0.8575    | 0.8549 | 0.8548 |
| 0.0470        | 16.0  | 3008 | 0.7549          | 0.851    | 0.933 | 0.8545    | 0.8509 | 0.8507 |
| 0.0433        | 17.0  | 3196 | 0.7754          | 0.856    | 0.932 | 0.8593    | 0.8562 | 0.8561 |
| 0.0440        | 18.0  | 3384 | 0.7868          | 0.850    | 0.932 | 0.8548    | 0.8502 | 0.8500 |
| 0.0408        | 19.0  | 3572 | 0.7890          | 0.855    | 0.932 | 0.8589    | 0.8549 | 0.8547 |
| 0.0412        | 20.0  | 3760 | 0.7870          | 0.855    | 0.932 | 0.8589    | 0.8549 | 0.8547 |
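
The metric columns can be produced by a `compute_metrics` function passed to the `Trainer`. A minimal sketch, assuming binary labels and weighted averaging; neither is confirmed by the original run, though the near-identical precision, recall, and accuracy values are consistent with weighted averaging:

```python
import numpy as np
from sklearn.metrics import (
    accuracy_score,
    f1_score,
    precision_score,
    recall_score,
    roc_auc_score,
)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    # Numerically stable softmax to turn logits into class probabilities.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=-1, keepdims=True)
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "auc": roc_auc_score(labels, probs[:, 1]),  # assumes binary labels
        "precision": precision_score(labels, preds, average="weighted"),
        "recall": recall_score(labels, preds, average="weighted"),
        "f1": f1_score(labels, preds, average="weighted"),
    }
```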


### Framework versions

- Transformers 4.52.3
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1