---
library_name: transformers
license: mit
base_model: roberta-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: cwe-parent-vulnerability-classification-roberta-base
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# cwe-parent-vulnerability-classification-roberta-base

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6858
- Accuracy: 0.6126
- F1 Macro: 0.3737
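
The gap between accuracy (0.61) and macro F1 (0.37) is typical of imbalanced label distributions: macro F1 averages per-class F1 scores with equal weight, so rare classes the model misses pull the average down even when overall accuracy looks reasonable. A minimal pure-Python sketch of the computation (illustrative data, not from this model):

```python
def macro_f1(y_true, y_pred):
    """Macro-averaged F1: per-class F1, averaged with equal class weight."""
    labels = sorted(set(y_true) | set(y_pred))
    f1s = []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        denom = 2 * tp + fp + fn
        f1s.append(2 * tp / denom if denom else 0.0)
    return sum(f1s) / len(f1s)

# Imbalanced toy example: a majority-class predictor scores 0.8 accuracy
# but a much lower macro F1, because the two rare classes get F1 = 0.
y_true = [0] * 8 + [1, 2]
y_pred = [0] * 10
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)  # 0.8
```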

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 40
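
Collected as keyword arguments, these settings map onto `transformers.TrainingArguments` roughly as follows. This is a sketch: the original training script is not shown in the card, so the argument names are assumptions based on the current transformers API.

```python
# Sketch: the hyperparameters above expressed as the TrainingArguments
# keyword arguments they would correspond to (approximate reconstruction).
training_kwargs = dict(
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch_fused",   # AdamW; betas=(0.9, 0.999), eps=1e-8 are defaults
    lr_scheduler_type="linear",
    num_train_epochs=40,
)
# e.g. args = TrainingArguments(output_dir="out", **training_kwargs)
```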

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|
| 3.078         | 1.0   | 237  | 3.0510          | 0.1776   | 0.0529   |
| 2.4726        | 2.0   | 474  | 2.2886          | 0.4398   | 0.2407   |
| 2.2031        | 3.0   | 711  | 1.9511          | 0.5185   | 0.3141   |
| 1.7872        | 4.0   | 948  | 1.7893          | 0.5638   | 0.3511   |
| 1.4324        | 5.0   | 1185 | 1.7492          | 0.6305   | 0.3805   |
| 1.2675        | 6.0   | 1422 | 1.6858          | 0.6126   | 0.3737   |
| 1.0437        | 7.0   | 1659 | 1.7359          | 0.6675   | 0.4296   |
| 0.8699        | 8.0   | 1896 | 1.7641          | 0.6746   | 0.4246   |
| 0.8832        | 9.0   | 2133 | 1.8097          | 0.6746   | 0.4444   |
| 0.8027        | 10.0  | 2370 | 1.8753          | 0.6698   | 0.4380   |
| 0.4583        | 11.0  | 2607 | 1.8919          | 0.6830   | 0.4473   |
| 0.5493        | 12.0  | 2844 | 1.8456          | 0.7080   | 0.4915   |
| 0.4808        | 13.0  | 3081 | 1.9593          | 0.6841   | 0.4555   |
| 0.4466        | 14.0  | 3318 | 2.0736          | 0.6865   | 0.4454   |
| 0.2989        | 15.0  | 3555 | 2.1972          | 0.6961   | 0.4474   |
| 0.255         | 16.0  | 3792 | 2.2513          | 0.7008   | 0.4638   |
| 0.2474        | 17.0  | 4029 | 2.2991          | 0.7223   | 0.4609   |
| 0.1648        | 18.0  | 4266 | 2.4582          | 0.7128   | 0.4614   |
| 0.2112        | 19.0  | 4503 | 2.5944          | 0.7247   | 0.4714   |
| 0.1185        | 20.0  | 4740 | 2.5292          | 0.7128   | 0.4557   |
| 0.1453        | 21.0  | 4977 | 2.6173          | 0.7104   | 0.4466   |
| 0.1126        | 22.0  | 5214 | 2.7072          | 0.7104   | 0.4461   |
| 0.0872        | 23.0  | 5451 | 2.8997          | 0.7235   | 0.4577   |
| 0.0768        | 24.0  | 5688 | 2.8199          | 0.7294   | 0.4623   |
| 0.0643        | 25.0  | 5925 | 2.9228          | 0.7211   | 0.4587   |
| 0.0828        | 26.0  | 6162 | 3.0185          | 0.7330   | 0.4774   |
| 0.0407        | 27.0  | 6399 | 3.1037          | 0.7211   | 0.4586   |
| 0.0386        | 28.0  | 6636 | 3.1938          | 0.7235   | 0.4622   |
| 0.0321        | 29.0  | 6873 | 3.2786          | 0.7318   | 0.4612   |
| 0.0189        | 30.0  | 7110 | 3.4453          | 0.7330   | 0.4559   |
| 0.0223        | 31.0  | 7347 | 3.3558          | 0.7366   | 0.4583   |
| 0.0255        | 32.0  | 7584 | 3.3787          | 0.7354   | 0.4682   |
| 0.0123        | 33.0  | 7821 | 3.4288          | 0.7306   | 0.4633   |
| 0.0128        | 34.0  | 8058 | 3.4361          | 0.7366   | 0.4645   |
| 0.0201        | 35.0  | 8295 | 3.6213          | 0.7235   | 0.4559   |
| 0.014         | 36.0  | 8532 | 3.7080          | 0.7247   | 0.4554   |
| 0.0159        | 37.0  | 8769 | 3.6249          | 0.7330   | 0.4622   |
| 0.027         | 38.0  | 9006 | 3.6598          | 0.7294   | 0.4604   |
| 0.0086        | 39.0  | 9243 | 3.7176          | 0.7342   | 0.4637   |
| 0.0096        | 40.0  | 9480 | 3.7223          | 0.7306   | 0.4614   |
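
The checkpoint reported at the top of the card corresponds to the minimum validation loss (epoch 6), even though accuracy keeps climbing through epoch 31 while the loss diverges; which row gets picked depends on the metric used for model selection. A small sketch over a few rows of the table above:

```python
# (epoch, validation loss, accuracy) taken from a few rows of the table
rows = [
    (5, 1.7492, 0.6305),
    (6, 1.6858, 0.6126),
    (7, 1.7359, 0.6675),
    (12, 1.8456, 0.7080),
    (31, 3.3558, 0.7366),
]

best_by_loss = min(rows, key=lambda r: r[1])  # epoch 6: the reported checkpoint
best_by_acc = max(rows, key=lambda r: r[2])   # epoch 31: higher accuracy, much higher loss
```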


### Framework versions

- Transformers 4.57.1
- Pytorch 2.9.1+cu128
- Datasets 4.4.1
- Tokenizers 0.22.1