maanasharma5 committed (verified)
Commit 8259ffb · 1 Parent(s): bd29a0d

Upload folder using huggingface_hub
README.md ADDED
@@ -0,0 +1,202 @@
+ ---
+ base_model: gpt2-medium
+ library_name: peft
+ ---
+
+ # Model Card for Model ID
+
+ <!-- Provide a quick summary of what the model is/does. -->
+
+
+
+ ## Model Details
+
+ ### Model Description
+
+ <!-- Provide a longer summary of what this model is. -->
+
+
+
+ - **Developed by:** [More Information Needed]
+ - **Funded by [optional]:** [More Information Needed]
+ - **Shared by [optional]:** [More Information Needed]
+ - **Model type:** [More Information Needed]
+ - **Language(s) (NLP):** [More Information Needed]
+ - **License:** [More Information Needed]
+ - **Finetuned from model [optional]:** [More Information Needed]
+
+ ### Model Sources [optional]
+
+ <!-- Provide the basic links for the model. -->
+
+ - **Repository:** [More Information Needed]
+ - **Paper [optional]:** [More Information Needed]
+ - **Demo [optional]:** [More Information Needed]
+
+ ## Uses
+
+ <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
+
+ ### Direct Use
+
+ <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
+
+ [More Information Needed]
+
+ ### Downstream Use [optional]
+
+ <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
+
+ [More Information Needed]
+
+ ### Out-of-Scope Use
+
+ <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
+
+ [More Information Needed]
+
+ ## Bias, Risks, and Limitations
+
+ <!-- This section is meant to convey both technical and sociotechnical limitations. -->
+
+ [More Information Needed]
+
+ ### Recommendations
+
+ <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
+
+ Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
+
+ ## How to Get Started with the Model
+
+ Use the code below to get started with the model.
+
+ [More Information Needed]
+
+ ## Training Details
+
+ ### Training Data
+
+ <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
+
+ [More Information Needed]
+
+ ### Training Procedure
+
+ <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
+
+ #### Preprocessing [optional]
+
+ [More Information Needed]
+
+
+ #### Training Hyperparameters
+
+ - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
+
+ #### Speeds, Sizes, Times [optional]
+
+ <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
+
+ [More Information Needed]
+
+ ## Evaluation
+
+ <!-- This section describes the evaluation protocols and provides the results. -->
+
+ ### Testing Data, Factors & Metrics
+
+ #### Testing Data
+
+ <!-- This should link to a Dataset Card if possible. -->
+
+ [More Information Needed]
+
+ #### Factors
+
+ <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
+
+ [More Information Needed]
+
+ #### Metrics
+
+ <!-- These are the evaluation metrics being used, ideally with a description of why. -->
+
+ [More Information Needed]
+
+ ### Results
+
+ [More Information Needed]
+
+ #### Summary
+
+
+
+ ## Model Examination [optional]
+
+ <!-- Relevant interpretability work for the model goes here -->
+
+ [More Information Needed]
+
+ ## Environmental Impact
+
+ <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
+
+ Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
+
+ - **Hardware Type:** [More Information Needed]
+ - **Hours used:** [More Information Needed]
+ - **Cloud Provider:** [More Information Needed]
+ - **Compute Region:** [More Information Needed]
+ - **Carbon Emitted:** [More Information Needed]
+
+ ## Technical Specifications [optional]
+
+ ### Model Architecture and Objective
+
+ [More Information Needed]
+
+ ### Compute Infrastructure
+
+ [More Information Needed]
+
+ #### Hardware
+
+ [More Information Needed]
+
+ #### Software
+
+ [More Information Needed]
+
+ ## Citation [optional]
+
+ <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
+
+ **BibTeX:**
+
+ [More Information Needed]
+
+ **APA:**
+
+ [More Information Needed]
+
+ ## Glossary [optional]
+
+ <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
+
+ [More Information Needed]
+
+ ## More Information [optional]
+
+ [More Information Needed]
+
+ ## Model Card Authors [optional]
+
+ [More Information Needed]
+
+ ## Model Card Contact
+
+ [More Information Needed]
+ ### Framework versions
+
+ - PEFT 0.13.2
adapter_config.json ADDED
@@ -0,0 +1,31 @@
+ {
+   "alpha_pattern": {},
+   "auto_mapping": {
+     "base_model_class": "GPT2LMHeadModel",
+     "parent_library": "transformers.models.gpt2.modeling_gpt2"
+   },
+   "base_model_name_or_path": "gpt2-medium",
+   "bias": "none",
+   "fan_in_fan_out": true,
+   "inference_mode": true,
+   "init_lora_weights": true,
+   "layer_replication": null,
+   "layers_pattern": null,
+   "layers_to_transform": null,
+   "loftq_config": {},
+   "lora_alpha": 128,
+   "lora_dropout": 0.0,
+   "megatron_config": null,
+   "megatron_core": "megatron.core",
+   "modules_to_save": null,
+   "peft_type": "LORA",
+   "r": 64,
+   "rank_pattern": {},
+   "revision": null,
+   "target_modules": [
+     "c_attn"
+   ],
+   "task_type": null,
+   "use_dora": false,
+   "use_rslora": false
+ }
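
For orientation, below is a minimal sketch of loading this adapter on top of its base model with PEFT, following the config above (base model gpt2-medium, LoRA on `c_attn` with r=64, lora_alpha=128). The adapter path is a placeholder, not a published repo id; point it at a local directory (or Hub repo) that contains adapter_config.json and adapter_model.safetensors.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Base model and tokenizer named in adapter_config.json.
base = AutoModelForCausalLM.from_pretrained("gpt2-medium")
tokenizer = AutoTokenizer.from_pretrained("gpt2-medium")
tokenizer.pad_token = tokenizer.eos_token  # mirrors special_tokens_map.json below

# "path/to/adapter" is hypothetical; substitute this repo's actual path.
model = PeftModel.from_pretrained(base, "path/to/adapter")
model.eval()

inputs = tokenizer("Hello, world", return_tensors="pt")
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```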
adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4f11970b99846be2b015b111b8105ad9235347bde7c7903fc0a89b30a0d9fc30
+ size 25172088
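
This is a Git LFS pointer rather than the weights themselves: `oid` is the SHA-256 of the real file and `size` is its length in bytes. A quick sketch for verifying a downloaded copy against the pointer (the local path is an assumption):

```python
import hashlib
import os

path = "adapter_model.safetensors"  # assumed local download location
with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# Values taken from the LFS pointer above.
assert digest == "4f11970b99846be2b015b111b8105ad9235347bde7c7903fc0a89b30a0d9fc30"
assert os.path.getsize(path) == 25172088
```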
checkpoint-3125/README.md ADDED
@@ -0,0 +1,202 @@
+ ---
+ base_model: gpt2-medium
+ library_name: peft
+ ---
+
+ # Model Card for Model ID
+
+ <!-- Provide a quick summary of what the model is/does. -->
+
+
+
+ ## Model Details
+
+ ### Model Description
+
+ <!-- Provide a longer summary of what this model is. -->
+
+
+
+ - **Developed by:** [More Information Needed]
+ - **Funded by [optional]:** [More Information Needed]
+ - **Shared by [optional]:** [More Information Needed]
+ - **Model type:** [More Information Needed]
+ - **Language(s) (NLP):** [More Information Needed]
+ - **License:** [More Information Needed]
+ - **Finetuned from model [optional]:** [More Information Needed]
+
+ ### Model Sources [optional]
+
+ <!-- Provide the basic links for the model. -->
+
+ - **Repository:** [More Information Needed]
+ - **Paper [optional]:** [More Information Needed]
+ - **Demo [optional]:** [More Information Needed]
+
+ ## Uses
+
+ <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
+
+ ### Direct Use
+
+ <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
+
+ [More Information Needed]
+
+ ### Downstream Use [optional]
+
+ <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
+
+ [More Information Needed]
+
+ ### Out-of-Scope Use
+
+ <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
+
+ [More Information Needed]
+
+ ## Bias, Risks, and Limitations
+
+ <!-- This section is meant to convey both technical and sociotechnical limitations. -->
+
+ [More Information Needed]
+
+ ### Recommendations
+
+ <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
+
+ Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
+
+ ## How to Get Started with the Model
+
+ Use the code below to get started with the model.
+
+ [More Information Needed]
+
+ ## Training Details
+
+ ### Training Data
+
+ <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
+
+ [More Information Needed]
+
+ ### Training Procedure
+
+ <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
+
+ #### Preprocessing [optional]
+
+ [More Information Needed]
+
+
+ #### Training Hyperparameters
+
+ - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
+
+ #### Speeds, Sizes, Times [optional]
+
+ <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
+
+ [More Information Needed]
+
+ ## Evaluation
+
+ <!-- This section describes the evaluation protocols and provides the results. -->
+
+ ### Testing Data, Factors & Metrics
+
+ #### Testing Data
+
+ <!-- This should link to a Dataset Card if possible. -->
+
+ [More Information Needed]
+
+ #### Factors
+
+ <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
+
+ [More Information Needed]
+
+ #### Metrics
+
+ <!-- These are the evaluation metrics being used, ideally with a description of why. -->
+
+ [More Information Needed]
+
+ ### Results
+
+ [More Information Needed]
+
+ #### Summary
+
+
+
+ ## Model Examination [optional]
+
+ <!-- Relevant interpretability work for the model goes here -->
+
+ [More Information Needed]
+
+ ## Environmental Impact
+
+ <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
+
+ Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
+
+ - **Hardware Type:** [More Information Needed]
+ - **Hours used:** [More Information Needed]
+ - **Cloud Provider:** [More Information Needed]
+ - **Compute Region:** [More Information Needed]
+ - **Carbon Emitted:** [More Information Needed]
+
+ ## Technical Specifications [optional]
+
+ ### Model Architecture and Objective
+
+ [More Information Needed]
+
+ ### Compute Infrastructure
+
+ [More Information Needed]
+
+ #### Hardware
+
+ [More Information Needed]
+
+ #### Software
+
+ [More Information Needed]
+
+ ## Citation [optional]
+
+ <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
+
+ **BibTeX:**
+
+ [More Information Needed]
+
+ **APA:**
+
+ [More Information Needed]
+
+ ## Glossary [optional]
+
+ <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
+
+ [More Information Needed]
+
+ ## More Information [optional]
+
+ [More Information Needed]
+
+ ## Model Card Authors [optional]
+
+ [More Information Needed]
+
+ ## Model Card Contact
+
+ [More Information Needed]
+ ### Framework versions
+
+ - PEFT 0.13.2
checkpoint-3125/adapter_config.json ADDED
@@ -0,0 +1,31 @@
+ {
+   "alpha_pattern": {},
+   "auto_mapping": {
+     "base_model_class": "GPT2LMHeadModel",
+     "parent_library": "transformers.models.gpt2.modeling_gpt2"
+   },
+   "base_model_name_or_path": "gpt2-medium",
+   "bias": "none",
+   "fan_in_fan_out": true,
+   "inference_mode": true,
+   "init_lora_weights": true,
+   "layer_replication": null,
+   "layers_pattern": null,
+   "layers_to_transform": null,
+   "loftq_config": {},
+   "lora_alpha": 128,
+   "lora_dropout": 0.0,
+   "megatron_config": null,
+   "megatron_core": "megatron.core",
+   "modules_to_save": null,
+   "peft_type": "LORA",
+   "r": 64,
+   "rank_pattern": {},
+   "revision": null,
+   "target_modules": [
+     "c_attn"
+   ],
+   "task_type": null,
+   "use_dora": false,
+   "use_rslora": false
+ }
checkpoint-3125/adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4f11970b99846be2b015b111b8105ad9235347bde7c7903fc0a89b30a0d9fc30
+ size 25172088
checkpoint-3125/merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
checkpoint-3125/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5793edcfe4b501f59d86fa80e83559fb0078612a1c95c8c361b5941fdf5b0666
+ size 50372538
checkpoint-3125/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:feded292b067a53a2aeb0e2a23dd8fd5fd080c27efc2767fe4b430da7e5f7d6f
+ size 14244
checkpoint-3125/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e42e57efa8b95889e87232fe6ea0805483ed6c8a2d35bb60230715b624ff5bda
+ size 1064
checkpoint-3125/special_tokens_map.json ADDED
@@ -0,0 +1,24 @@
+ {
+   "bos_token": {
+     "content": "<|endoftext|>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "<|endoftext|>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": "<|endoftext|>",
+   "unk_token": {
+     "content": "<|endoftext|>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
checkpoint-3125/tokenizer_config.json ADDED
@@ -0,0 +1,22 @@
+ {
+   "add_bos_token": false,
+   "add_prefix_space": false,
+   "added_tokens_decoder": {
+     "50256": {
+       "content": "<|endoftext|>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "bos_token": "<|endoftext|>",
+   "clean_up_tokenization_spaces": false,
+   "eos_token": "<|endoftext|>",
+   "errors": "replace",
+   "model_max_length": 1024,
+   "pad_token": "<|endoftext|>",
+   "tokenizer_class": "GPT2Tokenizer",
+   "unk_token": "<|endoftext|>"
+ }
checkpoint-3125/trainer_state.json ADDED
@@ -0,0 +1,2217 @@
+ {
+   "best_metric": null,
+   "best_model_checkpoint": null,
+   "epoch": 1.0,
+   "eval_steps": 500,
+   "global_step": 3125,
+   "is_hyper_param_search": false,
+   "is_local_process_zero": true,
+   "is_world_process_zero": true,
+   "log_history": [
+     {
+       "epoch": 0.0032,
+       "grad_norm": 10.0,
+       "learning_rate": 6.369426751592357e-06,
+       "loss": 24.9631,
+       "step": 10
+     },
+     {
+       "epoch": 0.0064,
+       "grad_norm": 10.0,
+       "learning_rate": 1.2738853503184714e-05,
+       "loss": 20.8416,
+       "step": 20
+     },
+     {
+       "epoch": 0.0096,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 1.910828025477707e-05,
+       "loss": 18.8244,
+       "step": 30
+     },
+     {
+       "epoch": 0.0128,
+       "grad_norm": 10.0,
+       "learning_rate": 2.5477707006369428e-05,
+       "loss": 20.6664,
+       "step": 40
+     },
+     {
+       "epoch": 0.016,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 3.184713375796178e-05,
+       "loss": 17.5711,
+       "step": 50
+     },
+     {
+       "epoch": 0.0192,
+       "grad_norm": 10.0,
+       "learning_rate": 3.821656050955414e-05,
+       "loss": 15.9775,
+       "step": 60
+     },
+     {
+       "epoch": 0.0224,
+       "grad_norm": 10.0,
+       "learning_rate": 4.45859872611465e-05,
+       "loss": 15.0004,
+       "step": 70
+     },
+     {
+       "epoch": 0.0256,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 5.0955414012738855e-05,
+       "loss": 13.074,
+       "step": 80
+     },
+     {
+       "epoch": 0.0288,
+       "grad_norm": 9.999998092651367,
+       "learning_rate": 5.732484076433121e-05,
+       "loss": 10.9575,
+       "step": 90
+     },
+     {
+       "epoch": 0.032,
+       "grad_norm": 9.999998092651367,
+       "learning_rate": 6.369426751592356e-05,
+       "loss": 8.7852,
+       "step": 100
+     },
+     {
+       "epoch": 0.0352,
+       "grad_norm": 9.999998092651367,
+       "learning_rate": 7.006369426751592e-05,
+       "loss": 8.9422,
+       "step": 110
+     },
+     {
+       "epoch": 0.0384,
+       "grad_norm": 10.0,
+       "learning_rate": 7.643312101910829e-05,
+       "loss": 6.8078,
+       "step": 120
+     },
+     {
+       "epoch": 0.0416,
+       "grad_norm": 10.0,
+       "learning_rate": 8.280254777070065e-05,
+       "loss": 6.0916,
+       "step": 130
+     },
+     {
+       "epoch": 0.0448,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 8.9171974522293e-05,
+       "loss": 5.8139,
+       "step": 140
+     },
+     {
+       "epoch": 0.048,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 9.554140127388536e-05,
+       "loss": 5.7631,
+       "step": 150
+     },
+     {
+       "epoch": 0.0512,
+       "grad_norm": 9.999998092651367,
+       "learning_rate": 9.989892183288411e-05,
+       "loss": 4.6852,
+       "step": 160
+     },
+     {
+       "epoch": 0.0544,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 9.956199460916442e-05,
+       "loss": 4.6054,
+       "step": 170
+     },
+     {
+       "epoch": 0.0576,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 9.922506738544474e-05,
+       "loss": 4.0257,
+       "step": 180
+     },
+     {
+       "epoch": 0.0608,
+       "grad_norm": 10.0,
+       "learning_rate": 9.888814016172507e-05,
+       "loss": 4.1777,
+       "step": 190
+     },
+     {
+       "epoch": 0.064,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 9.85512129380054e-05,
+       "loss": 4.0922,
+       "step": 200
+     },
+     {
+       "epoch": 0.0672,
+       "grad_norm": 9.999998092651367,
+       "learning_rate": 9.821428571428572e-05,
+       "loss": 3.5944,
+       "step": 210
+     },
+     {
+       "epoch": 0.0704,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 9.787735849056603e-05,
+       "loss": 3.5933,
+       "step": 220
+     },
+     {
+       "epoch": 0.0736,
+       "grad_norm": 10.0,
+       "learning_rate": 9.754043126684636e-05,
+       "loss": 3.684,
+       "step": 230
+     },
+     {
+       "epoch": 0.0768,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 9.720350404312669e-05,
+       "loss": 3.1428,
+       "step": 240
+     },
+     {
+       "epoch": 0.08,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 9.686657681940702e-05,
+       "loss": 2.9727,
+       "step": 250
+     },
+     {
+       "epoch": 0.0832,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 9.652964959568734e-05,
+       "loss": 3.2243,
+       "step": 260
+     },
+     {
+       "epoch": 0.0864,
+       "grad_norm": 9.463090896606445,
+       "learning_rate": 9.619272237196765e-05,
+       "loss": 2.9269,
+       "step": 270
+     },
+     {
+       "epoch": 0.0896,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 9.585579514824798e-05,
+       "loss": 2.955,
+       "step": 280
+     },
+     {
+       "epoch": 0.0928,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 9.551886792452831e-05,
+       "loss": 2.5544,
+       "step": 290
+     },
+     {
+       "epoch": 0.096,
+       "grad_norm": 10.0,
+       "learning_rate": 9.518194070080863e-05,
+       "loss": 2.7417,
+       "step": 300
+     },
+     {
+       "epoch": 0.0992,
+       "grad_norm": 9.999998092651367,
+       "learning_rate": 9.484501347708896e-05,
+       "loss": 2.569,
+       "step": 310
+     },
+     {
+       "epoch": 0.1024,
+       "grad_norm": 9.225275039672852,
+       "learning_rate": 9.450808625336927e-05,
+       "loss": 2.3499,
+       "step": 320
+     },
+     {
+       "epoch": 0.1056,
+       "grad_norm": 9.999998092651367,
+       "learning_rate": 9.41711590296496e-05,
+       "loss": 2.6455,
+       "step": 330
+     },
+     {
+       "epoch": 0.1088,
+       "grad_norm": 9.915996551513672,
+       "learning_rate": 9.383423180592993e-05,
+       "loss": 2.3627,
+       "step": 340
+     },
+     {
+       "epoch": 0.112,
+       "grad_norm": 9.526017189025879,
+       "learning_rate": 9.349730458221025e-05,
+       "loss": 2.3267,
+       "step": 350
+     },
+     {
+       "epoch": 0.1152,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 9.316037735849057e-05,
+       "loss": 2.1942,
+       "step": 360
+     },
+     {
+       "epoch": 0.1184,
+       "grad_norm": 10.0,
+       "learning_rate": 9.282345013477089e-05,
+       "loss": 2.2241,
+       "step": 370
+     },
+     {
+       "epoch": 0.1216,
+       "grad_norm": 9.999998092651367,
+       "learning_rate": 9.248652291105122e-05,
+       "loss": 2.1867,
+       "step": 380
+     },
+     {
+       "epoch": 0.1248,
+       "grad_norm": 6.984477996826172,
+       "learning_rate": 9.214959568733154e-05,
+       "loss": 2.1133,
+       "step": 390
+     },
+     {
+       "epoch": 0.128,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 9.181266846361186e-05,
+       "loss": 2.1515,
+       "step": 400
+     },
+     {
+       "epoch": 0.1312,
+       "grad_norm": 9.388639450073242,
+       "learning_rate": 9.14757412398922e-05,
+       "loss": 2.191,
+       "step": 410
+     },
+     {
+       "epoch": 0.1344,
+       "grad_norm": 6.955385684967041,
+       "learning_rate": 9.113881401617251e-05,
+       "loss": 2.05,
+       "step": 420
+     },
+     {
+       "epoch": 0.1376,
+       "grad_norm": 8.776995658874512,
+       "learning_rate": 9.080188679245284e-05,
+       "loss": 1.9369,
+       "step": 430
+     },
+     {
+       "epoch": 0.1408,
+       "grad_norm": 9.813057899475098,
+       "learning_rate": 9.046495956873315e-05,
+       "loss": 1.8571,
+       "step": 440
+     },
+     {
+       "epoch": 0.144,
+       "grad_norm": 9.999998092651367,
+       "learning_rate": 9.012803234501348e-05,
+       "loss": 1.8026,
+       "step": 450
+     },
+     {
+       "epoch": 0.1472,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 8.979110512129381e-05,
+       "loss": 1.6484,
+       "step": 460
+     },
+     {
+       "epoch": 0.1504,
+       "grad_norm": 6.5526018142700195,
+       "learning_rate": 8.945417789757413e-05,
+       "loss": 1.601,
+       "step": 470
+     },
+     {
+       "epoch": 0.1536,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 8.911725067385444e-05,
+       "loss": 1.5064,
+       "step": 480
+     },
+     {
+       "epoch": 0.1568,
+       "grad_norm": 7.3129496574401855,
+       "learning_rate": 8.878032345013477e-05,
+       "loss": 1.6099,
+       "step": 490
+     },
+     {
+       "epoch": 0.16,
+       "grad_norm": 8.732329368591309,
+       "learning_rate": 8.84433962264151e-05,
+       "loss": 1.5446,
+       "step": 500
+     },
+     {
+       "epoch": 0.1632,
+       "grad_norm": 6.793776512145996,
+       "learning_rate": 8.810646900269543e-05,
+       "loss": 1.3392,
+       "step": 510
+     },
+     {
+       "epoch": 0.1664,
+       "grad_norm": 8.43354606628418,
+       "learning_rate": 8.776954177897575e-05,
+       "loss": 1.3408,
+       "step": 520
+     },
+     {
+       "epoch": 0.1696,
+       "grad_norm": 7.959011077880859,
+       "learning_rate": 8.743261455525606e-05,
+       "loss": 1.3075,
+       "step": 530
+     },
+     {
+       "epoch": 0.1728,
+       "grad_norm": 10.0,
+       "learning_rate": 8.709568733153639e-05,
+       "loss": 1.4838,
+       "step": 540
+     },
+     {
+       "epoch": 0.176,
+       "grad_norm": 8.968912124633789,
+       "learning_rate": 8.675876010781672e-05,
+       "loss": 1.3092,
+       "step": 550
+     },
+     {
+       "epoch": 0.1792,
+       "grad_norm": 10.0,
+       "learning_rate": 8.642183288409704e-05,
+       "loss": 1.3005,
+       "step": 560
+     },
+     {
+       "epoch": 0.1824,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 8.608490566037735e-05,
+       "loss": 1.1666,
+       "step": 570
+     },
+     {
+       "epoch": 0.1856,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 8.574797843665768e-05,
+       "loss": 1.292,
+       "step": 580
+     },
+     {
+       "epoch": 0.1888,
+       "grad_norm": 9.630219459533691,
+       "learning_rate": 8.541105121293801e-05,
+       "loss": 1.2324,
+       "step": 590
+     },
+     {
+       "epoch": 0.192,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 8.507412398921834e-05,
+       "loss": 1.2701,
+       "step": 600
+     },
+     {
+       "epoch": 0.1952,
+       "grad_norm": 10.0,
+       "learning_rate": 8.473719676549866e-05,
+       "loss": 1.1545,
+       "step": 610
+     },
+     {
+       "epoch": 0.1984,
+       "grad_norm": 8.581560134887695,
+       "learning_rate": 8.440026954177897e-05,
+       "loss": 1.0769,
+       "step": 620
+     },
+     {
+       "epoch": 0.2016,
+       "grad_norm": 9.464119911193848,
+       "learning_rate": 8.40633423180593e-05,
+       "loss": 1.1256,
+       "step": 630
+     },
+     {
+       "epoch": 0.2048,
+       "grad_norm": 6.58554744720459,
+       "learning_rate": 8.372641509433963e-05,
+       "loss": 1.0227,
+       "step": 640
+     },
+     {
+       "epoch": 0.208,
+       "grad_norm": 6.532876491546631,
+       "learning_rate": 8.338948787061996e-05,
+       "loss": 1.077,
+       "step": 650
+     },
+     {
+       "epoch": 0.2112,
+       "grad_norm": 7.9927263259887695,
+       "learning_rate": 8.305256064690027e-05,
+       "loss": 1.0998,
+       "step": 660
+     },
+     {
+       "epoch": 0.2144,
+       "grad_norm": 5.303004264831543,
+       "learning_rate": 8.271563342318059e-05,
+       "loss": 0.9585,
+       "step": 670
+     },
+     {
+       "epoch": 0.2176,
+       "grad_norm": 6.559380531311035,
+       "learning_rate": 8.237870619946092e-05,
+       "loss": 0.8816,
+       "step": 680
+     },
+     {
+       "epoch": 0.2208,
+       "grad_norm": 7.447219371795654,
+       "learning_rate": 8.204177897574125e-05,
+       "loss": 1.1047,
+       "step": 690
+     },
+     {
+       "epoch": 0.224,
+       "grad_norm": 8.270160675048828,
+       "learning_rate": 8.170485175202158e-05,
+       "loss": 0.8914,
+       "step": 700
+     },
+     {
+       "epoch": 0.2272,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 8.136792452830189e-05,
+       "loss": 0.9932,
+       "step": 710
+     },
+     {
+       "epoch": 0.2304,
+       "grad_norm": 8.01986312866211,
+       "learning_rate": 8.103099730458221e-05,
+       "loss": 0.9273,
+       "step": 720
+     },
+     {
+       "epoch": 0.2336,
+       "grad_norm": 5.2210917472839355,
+       "learning_rate": 8.069407008086254e-05,
+       "loss": 1.0231,
+       "step": 730
+     },
+     {
+       "epoch": 0.2368,
+       "grad_norm": 5.701915740966797,
+       "learning_rate": 8.035714285714287e-05,
+       "loss": 0.8625,
+       "step": 740
+     },
+     {
+       "epoch": 0.24,
+       "grad_norm": 8.181998252868652,
+       "learning_rate": 8.002021563342318e-05,
+       "loss": 0.9228,
+       "step": 750
+     },
+     {
+       "epoch": 0.2432,
+       "grad_norm": 6.676185131072998,
+       "learning_rate": 7.968328840970351e-05,
+       "loss": 0.8675,
+       "step": 760
+     },
+     {
+       "epoch": 0.2464,
+       "grad_norm": 6.564544677734375,
+       "learning_rate": 7.934636118598383e-05,
+       "loss": 0.9035,
+       "step": 770
+     },
+     {
+       "epoch": 0.2496,
+       "grad_norm": 9.999998092651367,
+       "learning_rate": 7.900943396226416e-05,
+       "loss": 0.8019,
+       "step": 780
+     },
+     {
+       "epoch": 0.2528,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 7.867250673854449e-05,
+       "loss": 0.8066,
+       "step": 790
+     },
+     {
+       "epoch": 0.256,
+       "grad_norm": 10.0,
+       "learning_rate": 7.83355795148248e-05,
+       "loss": 0.8352,
+       "step": 800
+     },
+     {
+       "epoch": 0.2592,
+       "grad_norm": 7.660287857055664,
+       "learning_rate": 7.799865229110512e-05,
+       "loss": 0.8574,
+       "step": 810
+     },
+     {
+       "epoch": 0.2624,
+       "grad_norm": 4.782571792602539,
+       "learning_rate": 7.766172506738545e-05,
+       "loss": 0.8104,
+       "step": 820
+     },
+     {
+       "epoch": 0.2656,
+       "grad_norm": 7.636229991912842,
+       "learning_rate": 7.732479784366577e-05,
+       "loss": 0.8074,
+       "step": 830
+     },
+     {
+       "epoch": 0.2688,
+       "grad_norm": 5.22883939743042,
+       "learning_rate": 7.69878706199461e-05,
+       "loss": 0.7614,
+       "step": 840
+     },
+     {
+       "epoch": 0.272,
+       "grad_norm": 6.009452819824219,
+       "learning_rate": 7.665094339622642e-05,
+       "loss": 0.7084,
+       "step": 850
+     },
+     {
+       "epoch": 0.2752,
+       "grad_norm": 7.293655872344971,
+       "learning_rate": 7.631401617250674e-05,
+       "loss": 0.7718,
+       "step": 860
+     },
+     {
+       "epoch": 0.2784,
+       "grad_norm": 5.065558433532715,
+       "learning_rate": 7.597708894878706e-05,
+       "loss": 0.7292,
+       "step": 870
+     },
+     {
+       "epoch": 0.2816,
+       "grad_norm": 7.277278423309326,
+       "learning_rate": 7.56401617250674e-05,
+       "loss": 0.7474,
+       "step": 880
+     },
+     {
+       "epoch": 0.2848,
+       "grad_norm": 9.6370210647583,
+       "learning_rate": 7.530323450134771e-05,
+       "loss": 0.7241,
+       "step": 890
+     },
+     {
+       "epoch": 0.288,
+       "grad_norm": 7.019769191741943,
+       "learning_rate": 7.496630727762804e-05,
+       "loss": 0.6728,
+       "step": 900
+     },
+     {
+       "epoch": 0.2912,
+       "grad_norm": 5.834097862243652,
+       "learning_rate": 7.462938005390835e-05,
+       "loss": 0.7054,
+       "step": 910
+     },
+     {
+       "epoch": 0.2944,
+       "grad_norm": 9.999998092651367,
+       "learning_rate": 7.429245283018868e-05,
+       "loss": 0.7325,
+       "step": 920
+     },
+     {
+       "epoch": 0.2976,
+       "grad_norm": 8.013876914978027,
+       "learning_rate": 7.395552560646901e-05,
+       "loss": 0.7168,
+       "step": 930
+     },
+     {
+       "epoch": 0.3008,
+       "grad_norm": 5.732053756713867,
+       "learning_rate": 7.361859838274933e-05,
+       "loss": 0.5667,
+       "step": 940
+     },
+     {
+       "epoch": 0.304,
+       "grad_norm": 4.955954551696777,
+       "learning_rate": 7.328167115902966e-05,
+       "loss": 0.6504,
+       "step": 950
+     },
+     {
+       "epoch": 0.3072,
+       "grad_norm": 4.867282390594482,
+       "learning_rate": 7.294474393530997e-05,
+       "loss": 0.5968,
+       "step": 960
+     },
+     {
+       "epoch": 0.3104,
+       "grad_norm": 6.085322856903076,
+       "learning_rate": 7.26078167115903e-05,
+       "loss": 0.6032,
+       "step": 970
+     },
+     {
+       "epoch": 0.3136,
+       "grad_norm": 6.458901882171631,
+       "learning_rate": 7.227088948787062e-05,
+       "loss": 0.6642,
+       "step": 980
+     },
+     {
+       "epoch": 0.3168,
+       "grad_norm": 8.118791580200195,
+       "learning_rate": 7.193396226415095e-05,
+       "loss": 0.6572,
+       "step": 990
+     },
+     {
+       "epoch": 0.32,
+       "grad_norm": 6.958062171936035,
+       "learning_rate": 7.159703504043128e-05,
+       "loss": 0.6236,
+       "step": 1000
+     },
+     {
+       "epoch": 0.3232,
+       "grad_norm": 4.629474639892578,
+       "learning_rate": 7.126010781671159e-05,
+       "loss": 0.5852,
+       "step": 1010
+     },
+     {
+       "epoch": 0.3264,
+       "grad_norm": 5.775842189788818,
+       "learning_rate": 7.092318059299192e-05,
+       "loss": 0.6545,
+       "step": 1020
+     },
+     {
+       "epoch": 0.3296,
+       "grad_norm": 5.303742408752441,
+       "learning_rate": 7.058625336927224e-05,
+       "loss": 0.5677,
+       "step": 1030
+     },
+     {
+       "epoch": 0.3328,
+       "grad_norm": 6.655779838562012,
+       "learning_rate": 7.024932614555257e-05,
+       "loss": 0.5463,
+       "step": 1040
+     },
+     {
+       "epoch": 0.336,
+       "grad_norm": 6.062790870666504,
+       "learning_rate": 6.99123989218329e-05,
+       "loss": 0.5669,
+       "step": 1050
+     },
+     {
+       "epoch": 0.3392,
+       "grad_norm": 4.374266147613525,
+       "learning_rate": 6.957547169811321e-05,
+       "loss": 0.5475,
+       "step": 1060
+     },
+     {
+       "epoch": 0.3424,
+       "grad_norm": 4.560005187988281,
+       "learning_rate": 6.923854447439353e-05,
+       "loss": 0.5624,
+       "step": 1070
+     },
+     {
+       "epoch": 0.3456,
+       "grad_norm": 4.675742149353027,
+       "learning_rate": 6.890161725067386e-05,
+       "loss": 0.5455,
+       "step": 1080
+     },
+     {
+       "epoch": 0.3488,
+       "grad_norm": 5.191764831542969,
+       "learning_rate": 6.856469002695418e-05,
+       "loss": 0.5485,
+       "step": 1090
+     },
+     {
+       "epoch": 0.352,
+       "grad_norm": 6.050549030303955,
+       "learning_rate": 6.822776280323451e-05,
+       "loss": 0.5439,
+       "step": 1100
+     },
+     {
+       "epoch": 0.3552,
+       "grad_norm": 7.021573543548584,
+       "learning_rate": 6.789083557951483e-05,
+       "loss": 0.4899,
+       "step": 1110
+     },
+     {
+       "epoch": 0.3584,
+       "grad_norm": 5.938823699951172,
+       "learning_rate": 6.755390835579514e-05,
+       "loss": 0.499,
+       "step": 1120
+     },
+     {
+       "epoch": 0.3616,
+       "grad_norm": 6.675471305847168,
+       "learning_rate": 6.721698113207547e-05,
+       "loss": 0.4657,
+       "step": 1130
+     },
+     {
+       "epoch": 0.3648,
+       "grad_norm": 4.019260883331299,
+       "learning_rate": 6.68800539083558e-05,
+       "loss": 0.4819,
+       "step": 1140
+     },
+     {
+       "epoch": 0.368,
+       "grad_norm": 6.338311195373535,
+       "learning_rate": 6.654312668463612e-05,
+       "loss": 0.4471,
+       "step": 1150
+     },
+     {
+       "epoch": 0.3712,
+       "grad_norm": 5.028086185455322,
+       "learning_rate": 6.620619946091643e-05,
+       "loss": 0.4334,
+       "step": 1160
+     },
+     {
+       "epoch": 0.3744,
+       "grad_norm": 3.6760733127593994,
+       "learning_rate": 6.586927223719676e-05,
+       "loss": 0.4569,
+       "step": 1170
+     },
+     {
+       "epoch": 0.3776,
+       "grad_norm": 5.685121059417725,
+       "learning_rate": 6.553234501347709e-05,
+       "loss": 0.4314,
+       "step": 1180
+     },
+     {
+       "epoch": 0.3808,
+       "grad_norm": 4.894414901733398,
+       "learning_rate": 6.519541778975742e-05,
+       "loss": 0.4527,
+       "step": 1190
+     },
+     {
+       "epoch": 0.384,
+       "grad_norm": 6.624325275421143,
+       "learning_rate": 6.485849056603774e-05,
+       "loss": 0.3814,
+       "step": 1200
+     },
+     {
+       "epoch": 0.3872,
+       "grad_norm": 7.109598636627197,
+       "learning_rate": 6.452156334231805e-05,
+       "loss": 0.4354,
+       "step": 1210
+     },
+     {
+       "epoch": 0.3904,
+       "grad_norm": 5.322418689727783,
+       "learning_rate": 6.418463611859838e-05,
+       "loss": 0.4103,
+       "step": 1220
+     },
+     {
+       "epoch": 0.3936,
+       "grad_norm": 4.165887832641602,
+       "learning_rate": 6.384770889487871e-05,
+       "loss": 0.414,
+       "step": 1230
+     },
+     {
+       "epoch": 0.3968,
+       "grad_norm": 2.6662333011627197,
+       "learning_rate": 6.351078167115904e-05,
+       "loss": 0.4091,
+       "step": 1240
+     },
+     {
+       "epoch": 0.4,
+       "grad_norm": 4.60746431350708,
+       "learning_rate": 6.317385444743936e-05,
+       "loss": 0.434,
+       "step": 1250
+     },
+     {
+       "epoch": 0.4032,
+       "grad_norm": 4.521622180938721,
+       "learning_rate": 6.283692722371967e-05,
+       "loss": 0.3858,
+       "step": 1260
+     },
+     {
+       "epoch": 0.4064,
+       "grad_norm": 3.3457305431365967,
+       "learning_rate": 6.25e-05,
+       "loss": 0.3935,
+       "step": 1270
+     },
+     {
+       "epoch": 0.4096,
+       "grad_norm": 3.885206699371338,
+       "learning_rate": 6.216307277628033e-05,
+       "loss": 0.4036,
+       "step": 1280
+     },
+     {
+       "epoch": 0.4128,
+       "grad_norm": 3.958425998687744,
+       "learning_rate": 6.182614555256066e-05,
+       "loss": 0.3972,
+       "step": 1290
+     },
+     {
+       "epoch": 0.416,
+       "grad_norm": 4.6747846603393555,
+       "learning_rate": 6.148921832884098e-05,
+       "loss": 0.359,
+       "step": 1300
+     },
+     {
+       "epoch": 0.4192,
+       "grad_norm": 3.6438868045806885,
+       "learning_rate": 6.115229110512129e-05,
+       "loss": 0.3843,
+       "step": 1310
+     },
+     {
+       "epoch": 0.4224,
+       "grad_norm": 4.175204753875732,
+       "learning_rate": 6.081536388140162e-05,
+       "loss": 0.3952,
+       "step": 1320
+     },
+     {
+       "epoch": 0.4256,
+       "grad_norm": 2.771331548690796,
+       "learning_rate": 6.047843665768195e-05,
+       "loss": 0.363,
+       "step": 1330
+     },
+     {
+       "epoch": 0.4288,
+       "grad_norm": 3.504378080368042,
+       "learning_rate": 6.0141509433962265e-05,
+       "loss": 0.3401,
+       "step": 1340
+     },
+     {
+       "epoch": 0.432,
+       "grad_norm": 4.495083808898926,
+       "learning_rate": 5.980458221024259e-05,
+       "loss": 0.335,
+       "step": 1350
+     },
+     {
+       "epoch": 0.4352,
+       "grad_norm": 3.8098154067993164,
+       "learning_rate": 5.9467654986522916e-05,
+       "loss": 0.3138,
+       "step": 1360
+     },
+     {
+       "epoch": 0.4384,
+       "grad_norm": 4.181695938110352,
+       "learning_rate": 5.913072776280324e-05,
+       "loss": 0.3234,
+       "step": 1370
+     },
+     {
+       "epoch": 0.4416,
+       "grad_norm": 3.0462260246276855,
+       "learning_rate": 5.879380053908357e-05,
+       "loss": 0.2771,
+       "step": 1380
+     },
+     {
+       "epoch": 0.4448,
+       "grad_norm": 3.0168416500091553,
+       "learning_rate": 5.8456873315363884e-05,
+       "loss": 0.3024,
+       "step": 1390
+     },
+     {
+       "epoch": 0.448,
+       "grad_norm": 4.196632385253906,
+       "learning_rate": 5.8119946091644206e-05,
+       "loss": 0.307,
+       "step": 1400
+     },
+     {
+       "epoch": 0.4512,
+       "grad_norm": 3.4224884510040283,
+       "learning_rate": 5.7783018867924535e-05,
+       "loss": 0.3261,
+       "step": 1410
+     },
+     {
+       "epoch": 0.4544,
+       "grad_norm": 3.340468645095825,
+       "learning_rate": 5.744609164420486e-05,
+       "loss": 0.3116,
+       "step": 1420
+     },
+     {
+       "epoch": 0.4576,
+       "grad_norm": 3.3401427268981934,
+       "learning_rate": 5.710916442048517e-05,
+       "loss": 0.3037,
+       "step": 1430
+     },
+     {
+       "epoch": 0.4608,
+       "grad_norm": 3.3940038681030273,
+       "learning_rate": 5.6772237196765496e-05,
+       "loss": 0.2975,
+       "step": 1440
+     },
+     {
+       "epoch": 0.464,
+       "grad_norm": 2.9820821285247803,
+       "learning_rate": 5.6435309973045825e-05,
+       "loss": 0.288,
+       "step": 1450
+     },
+     {
+       "epoch": 0.4672,
+       "grad_norm": 3.953834056854248,
+       "learning_rate": 5.609838274932615e-05,
+       "loss": 0.2998,
+       "step": 1460
+     },
+     {
+       "epoch": 0.4704,
+       "grad_norm": 3.664151906967163,
+       "learning_rate": 5.5761455525606476e-05,
+       "loss": 0.3029,
+       "step": 1470
+     },
+     {
+       "epoch": 0.4736,
+       "grad_norm": 5.874343395233154,
+       "learning_rate": 5.542452830188679e-05,
+       "loss": 0.308,
+       "step": 1480
+     },
+     {
+       "epoch": 0.4768,
+       "grad_norm": 5.167667388916016,
+       "learning_rate": 5.5087601078167114e-05,
+       "loss": 0.293,
+       "step": 1490
+     },
+     {
+       "epoch": 0.48,
+       "grad_norm": 2.0990707874298096,
+       "learning_rate": 5.4750673854447444e-05,
+       "loss": 0.2736,
+       "step": 1500
+     },
+     {
+       "epoch": 0.4832,
+       "grad_norm": 3.07232403755188,
+       "learning_rate": 5.4413746630727766e-05,
+       "loss": 0.2802,
+       "step": 1510
+     },
+     {
+       "epoch": 0.4864,
+       "grad_norm": 3.340930461883545,
+       "learning_rate": 5.407681940700808e-05,
+       "loss": 0.2773,
+       "step": 1520
+     },
+     {
+       "epoch": 0.4896,
+       "grad_norm": 2.88307523727417,
+       "learning_rate": 5.373989218328841e-05,
+       "loss": 0.2645,
+       "step": 1530
+     },
+     {
+       "epoch": 0.4928,
+       "grad_norm": 3.0353357791900635,
+       "learning_rate": 5.340296495956873e-05,
+       "loss": 0.2864,
+       "step": 1540
+     },
+     {
+       "epoch": 0.496,
+       "grad_norm": 2.4584333896636963,
+       "learning_rate": 5.306603773584906e-05,
+       "loss": 0.2674,
+       "step": 1550
+     },
+     {
+       "epoch": 0.4992,
+       "grad_norm": 2.793869972229004,
+       "learning_rate": 5.2729110512129385e-05,
+       "loss": 0.2425,
+       "step": 1560
+     },
+     {
+       "epoch": 0.5024,
+       "grad_norm": 2.5717570781707764,
+       "learning_rate": 5.23921832884097e-05,
+       "loss": 0.245,
+       "step": 1570
+     },
+     {
+       "epoch": 0.5056,
+       "grad_norm": 2.962009906768799,
+       "learning_rate": 5.205525606469003e-05,
+       "loss": 0.2553,
+       "step": 1580
+     },
+     {
+       "epoch": 0.5088,
+       "grad_norm": 2.053489923477173,
+       "learning_rate": 5.171832884097035e-05,
+       "loss": 0.2338,
+       "step": 1590
+     },
+     {
+       "epoch": 0.512,
+       "grad_norm": 3.5270988941192627,
+       "learning_rate": 5.138140161725068e-05,
+       "loss": 0.2349,
+       "step": 1600
+     },
+     {
+       "epoch": 0.5152,
+       "grad_norm": 3.227064847946167,
+       "learning_rate": 5.1044474393531e-05,
+       "loss": 0.2475,
+       "step": 1610
+     },
+     {
+       "epoch": 0.5184,
+       "grad_norm": 3.002271890640259,
+       "learning_rate": 5.070754716981132e-05,
+       "loss": 0.2264,
+       "step": 1620
+     },
+     {
+       "epoch": 0.5216,
+       "grad_norm": 2.5545215606689453,
+       "learning_rate": 5.037061994609165e-05,
+       "loss": 0.2473,
+       "step": 1630
+     },
+     {
+       "epoch": 0.5248,
+       "grad_norm": 3.1033124923706055,
+       "learning_rate": 5.003369272237197e-05,
+       "loss": 0.228,
+       "step": 1640
+     },
+     {
+       "epoch": 0.528,
+       "grad_norm": 2.90221905708313,
+       "learning_rate": 4.969676549865229e-05,
+       "loss": 0.2337,
+       "step": 1650
+     },
+     {
+       "epoch": 0.5312,
+       "grad_norm": 3.5613458156585693,
+       "learning_rate": 4.9359838274932616e-05,
+       "loss": 0.2322,
+       "step": 1660
+     },
+     {
+       "epoch": 0.5344,
+       "grad_norm": 2.788691282272339,
+       "learning_rate": 4.902291105121294e-05,
+       "loss": 0.2153,
+       "step": 1670
+     },
+     {
+       "epoch": 0.5376,
+       "grad_norm": 2.7116010189056396,
+       "learning_rate": 4.868598382749327e-05,
+       "loss": 0.2249,
+       "step": 1680
+     },
+     {
+       "epoch": 0.5408,
+       "grad_norm": 2.3744070529937744,
+       "learning_rate": 4.834905660377358e-05,
+       "loss": 0.2265,
+       "step": 1690
+     },
+     {
+       "epoch": 0.544,
+       "grad_norm": 2.920132637023926,
+       "learning_rate": 4.801212938005391e-05,
+       "loss": 0.2132,
+       "step": 1700
+     },
+     {
+       "epoch": 0.5472,
+       "grad_norm": 2.0210254192352295,
+       "learning_rate": 4.7675202156334234e-05,
+       "loss": 0.2061,
+       "step": 1710
+     },
+     {
+       "epoch": 0.5504,
+       "grad_norm": 2.7059290409088135,
+       "learning_rate": 4.733827493261456e-05,
+       "loss": 0.1875,
+       "step": 1720
+     },
+     {
+       "epoch": 0.5536,
+       "grad_norm": 2.262721061706543,
+       "learning_rate": 4.7001347708894886e-05,
+       "loss": 0.2012,
+       "step": 1730
+     },
+     {
+       "epoch": 0.5568,
+       "grad_norm": 3.484440326690674,
+       "learning_rate": 4.66644204851752e-05,
+       "loss": 0.2325,
+       "step": 1740
+     },
+     {
+       "epoch": 0.56,
+       "grad_norm": 2.090310573577881,
+       "learning_rate": 4.632749326145553e-05,
+       "loss": 0.2051,
+       "step": 1750
+     },
+     {
+       "epoch": 0.5632,
+       "grad_norm": 2.2044506072998047,
+       "learning_rate": 4.5990566037735846e-05,
+       "loss": 0.1883,
+       "step": 1760
+     },
+     {
+       "epoch": 0.5664,
+       "grad_norm": 2.702648878097534,
+       "learning_rate": 4.5653638814016176e-05,
+       "loss": 0.2145,
+       "step": 1770
+     },
+     {
+       "epoch": 0.5696,
+       "grad_norm": 9.999999046325684,
+       "learning_rate": 4.53167115902965e-05,
+       "loss": 0.2058,
+       "step": 1780
+     },
+     {
+       "epoch": 0.5728,
+       "grad_norm": 5.150343894958496,
+       "learning_rate": 4.497978436657682e-05,
+       "loss": 0.2181,
+       "step": 1790
+     },
+     {
+       "epoch": 0.576,
+       "grad_norm": 1.4523239135742188,
+       "learning_rate": 4.464285714285715e-05,
+       "loss": 0.1986,
+       "step": 1800
+     },
+     {
+       "epoch": 0.5792,
+       "grad_norm": 2.1557908058166504,
+       "learning_rate": 4.4305929919137465e-05,
+       "loss": 0.1908,
+       "step": 1810
+     },
+     {
+       "epoch": 0.5824,
+       "grad_norm": 2.1338348388671875,
+       "learning_rate": 4.3969002695417794e-05,
+       "loss": 0.185,
+       "step": 1820
+     },
+     {
+       "epoch": 0.5856,
+       "grad_norm": 2.266662836074829,
+       "learning_rate": 4.363207547169812e-05,
+       "loss": 0.1899,
+       "step": 1830
+     },
+     {
+       "epoch": 0.5888,
+       "grad_norm": 1.9031009674072266,
+       "learning_rate": 4.329514824797844e-05,
+       "loss": 0.1799,
+       "step": 1840
+     },
+     {
+       "epoch": 0.592,
+       "grad_norm": 1.731602668762207,
+       "learning_rate": 4.295822102425876e-05,
+       "loss": 0.1965,
+       "step": 1850
+     },
+     {
+       "epoch": 0.5952,
+       "grad_norm": 1.3521720170974731,
+       "learning_rate": 4.2621293800539084e-05,
+       "loss": 0.1998,
+       "step": 1860
+     },
+     {
+       "epoch": 0.5984,
+       "grad_norm": 2.028102397918701,
+       "learning_rate": 4.2284366576819406e-05,
+       "loss": 0.1873,
+       "step": 1870
+     },
+     {
+       "epoch": 0.6016,
+       "grad_norm": 3.0926480293273926,
+       "learning_rate": 4.1947439353099736e-05,
+       "loss": 0.1821,
+       "step": 1880
+     },
+     {
+       "epoch": 0.6048,
+       "grad_norm": 2.1551952362060547,
+       "learning_rate": 4.161051212938006e-05,
+       "loss": 0.1737,
+       "step": 1890
+     },
+     {
+       "epoch": 0.608,
+       "grad_norm": 2.661207437515259,
+       "learning_rate": 4.127358490566038e-05,
+       "loss": 0.1898,
+       "step": 1900
+     },
+     {
+       "epoch": 0.6112,
+       "grad_norm": 2.934997081756592,
+       "learning_rate": 4.09366576819407e-05,
+       "loss": 0.1902,
+       "step": 1910
+     },
+     {
+       "epoch": 0.6144,
+       "grad_norm": 1.7465407848358154,
+       "learning_rate": 4.0599730458221025e-05,
+       "loss": 0.1744,
+       "step": 1920
+     },
+     {
+       "epoch": 0.6176,
+       "grad_norm": 2.341937780380249,
+       "learning_rate": 4.026280323450135e-05,
+       "loss": 0.1749,
+       "step": 1930
+     },
+     {
+       "epoch": 0.6208,
+       "grad_norm": 2.418426513671875,
+       "learning_rate": 3.992587601078167e-05,
+       "loss": 0.164,
+       "step": 1940
+     },
+     {
+       "epoch": 0.624,
+       "grad_norm": 2.408447265625,
+       "learning_rate": 3.9588948787062e-05,
+       "loss": 0.1682,
+       "step": 1950
+     },
+     {
+       "epoch": 0.6272,
+       "grad_norm": 1.976258397102356,
+       "learning_rate": 3.9252021563342315e-05,
+       "loss": 0.1676,
+       "step": 1960
+     },
+     {
+       "epoch": 0.6304,
+       "grad_norm": 2.601335287094116,
+       "learning_rate": 3.8915094339622644e-05,
+       "loss": 0.1572,
+       "step": 1970
+     },
+     {
+       "epoch": 0.6336,
+       "grad_norm": 2.3064210414886475,
+       "learning_rate": 3.8578167115902966e-05,
+       "loss": 0.1545,
+       "step": 1980
+     },
+     {
+       "epoch": 0.6368,
+       "grad_norm": 2.60603666305542,
+       "learning_rate": 3.824123989218329e-05,
+       "loss": 0.1657,
+       "step": 1990
+     },
+     {
+       "epoch": 0.64,
+       "grad_norm": 2.5705275535583496,
+       "learning_rate": 3.790431266846362e-05,
+       "loss": 0.165,
+       "step": 2000
+     },
+     {
+       "epoch": 0.6432,
+       "grad_norm": 3.4823734760284424,
+       "learning_rate": 3.7567385444743934e-05,
+       "loss": 0.1705,
+       "step": 2010
+     },
+     {
+       "epoch": 0.6464,
+       "grad_norm": 1.5383442640304565,
+       "learning_rate": 3.723045822102426e-05,
+       "loss": 0.1502,
+       "step": 2020
+     },
+     {
+       "epoch": 0.6496,
+       "grad_norm": 2.4880733489990234,
+       "learning_rate": 3.689353099730458e-05,
+       "loss": 0.1598,
+       "step": 2030
+     },
+     {
+       "epoch": 0.6528,
+       "grad_norm": 2.6558523178100586,
+       "learning_rate": 3.655660377358491e-05,
+       "loss": 0.1582,
+       "step": 2040
+     },
+     {
+       "epoch": 0.656,
+       "grad_norm": 2.7092580795288086,
+       "learning_rate": 3.621967654986524e-05,
+       "loss": 0.1596,
+       "step": 2050
+     },
+     {
+       "epoch": 0.6592,
+       "grad_norm": 2.1501896381378174,
+       "learning_rate": 3.588274932614555e-05,
+       "loss": 0.1505,
+       "step": 2060
+     },
+     {
+       "epoch": 0.6624,
+       "grad_norm": 1.9364612102508545,
+       "learning_rate": 3.554582210242588e-05,
+       "loss": 0.1498,
+       "step": 2070
+     },
+     {
+       "epoch": 0.6656,
+       "grad_norm": 1.987347960472107,
+       "learning_rate": 3.52088948787062e-05,
+       "loss": 0.1561,
+       "step": 2080
+     },
+     {
+       "epoch": 0.6688,
+       "grad_norm": 2.656722068786621,
+       "learning_rate": 3.4871967654986526e-05,
+       "loss": 0.153,
+       "step": 2090
+     },
+     {
+       "epoch": 0.672,
+       "grad_norm": 2.4392051696777344,
+       "learning_rate": 3.453504043126685e-05,
+       "loss": 0.162,
+       "step": 2100
+     },
+     {
+       "epoch": 0.6752,
+       "grad_norm": 2.220731019973755,
+       "learning_rate": 3.419811320754717e-05,
+       "loss": 0.1627,
+       "step": 2110
+     },
+     {
+       "epoch": 0.6784,
+       "grad_norm": 1.5795049667358398,
+       "learning_rate": 3.3861185983827494e-05,
+       "loss": 0.1652,
+       "step": 2120
+     },
+     {
+       "epoch": 0.6816,
+       "grad_norm": 2.1749913692474365,
+       "learning_rate": 3.3524258760107816e-05,
+       "loss": 0.1606,
+       "step": 2130
+     },
+     {
+       "epoch": 0.6848,
+       "grad_norm": 2.301445960998535,
+       "learning_rate": 3.3187331536388145e-05,
+       "loss": 0.1461,
+       "step": 2140
+     },
+     {
+       "epoch": 0.688,
+       "grad_norm": 2.8436062335968018,
+       "learning_rate": 3.285040431266847e-05,
+       "loss": 0.1498,
+       "step": 2150
+     },
+     {
+       "epoch": 0.6912,
+       "grad_norm": 1.9361361265182495,
+       "learning_rate": 3.251347708894879e-05,
+       "loss": 0.1543,
+       "step": 2160
+     },
+     {
+       "epoch": 0.6944,
+       "grad_norm": 1.9528151750564575,
+       "learning_rate": 3.217654986522911e-05,
+       "loss": 0.1402,
+       "step": 2170
+     },
+     {
+       "epoch": 0.6976,
+       "grad_norm": 2.5773980617523193,
+       "learning_rate": 3.1839622641509435e-05,
+       "loss": 0.1329,
+       "step": 2180
+     },
+     {
+       "epoch": 0.7008,
+       "grad_norm": 2.697664976119995,
+       "learning_rate": 3.150269541778976e-05,
+       "loss": 0.1574,
+       "step": 2190
+     },
+     {
+       "epoch": 0.704,
+       "grad_norm": 2.53369402885437,
+       "learning_rate": 3.1165768194070086e-05,
+       "loss": 0.1513,
+       "step": 2200
+     },
+     {
+       "epoch": 0.7072,
+       "grad_norm": 2.3387765884399414,
+       "learning_rate": 3.08288409703504e-05,
+       "loss": 0.1463,
+       "step": 2210
+     },
+     {
+       "epoch": 0.7104,
+       "grad_norm": 1.6529033184051514,
+       "learning_rate": 3.0491913746630728e-05,
+       "loss": 0.1373,
+       "step": 2220
+     },
+     {
+       "epoch": 0.7136,
+       "grad_norm": 2.692295551300049,
+       "learning_rate": 3.0154986522911054e-05,
+       "loss": 0.1325,
+       "step": 2230
+     },
+     {
+       "epoch": 0.7168,
+       "grad_norm": 2.09934401512146,
+       "learning_rate": 2.9818059299191376e-05,
+       "loss": 0.1484,
+       "step": 2240
+     },
+     {
+       "epoch": 0.72,
+       "grad_norm": 2.7355453968048096,
+       "learning_rate": 2.9481132075471702e-05,
+       "loss": 0.1341,
+       "step": 2250
+     },
+     {
+       "epoch": 0.7232,
+       "grad_norm": 1.8107614517211914,
+       "learning_rate": 2.914420485175202e-05,
+       "loss": 0.1475,
+       "step": 2260
+     },
+     {
+       "epoch": 0.7264,
+       "grad_norm": 1.958337426185608,
+       "learning_rate": 2.8807277628032347e-05,
+       "loss": 0.1438,
+       "step": 2270
+     },
+     {
+       "epoch": 0.7296,
+       "grad_norm": 1.456838607788086,
+       "learning_rate": 2.847035040431267e-05,
+       "loss": 0.1395,
+       "step": 2280
+     },
+     {
+       "epoch": 0.7328,
+       "grad_norm": 1.997288703918457,
+       "learning_rate": 2.8133423180592995e-05,
+       "loss": 0.1329,
+       "step": 2290
+     },
+     {
+       "epoch": 0.736,
+       "grad_norm": 2.1655972003936768,
+       "learning_rate": 2.7796495956873314e-05,
+       "loss": 0.1357,
+       "step": 2300
+     },
+     {
+       "epoch": 0.7392,
+       "grad_norm": 1.8002907037734985,
+       "learning_rate": 2.745956873315364e-05,
1625
+ "loss": 0.1444,
1626
+ "step": 2310
1627
+ },
1628
+ {
1629
+ "epoch": 0.7424,
1630
+ "grad_norm": 2.862435817718506,
1631
+ "learning_rate": 2.7122641509433965e-05,
1632
+ "loss": 0.1455,
1633
+ "step": 2320
1634
+ },
1635
+ {
1636
+ "epoch": 0.7456,
1637
+ "grad_norm": 1.9556299448013306,
1638
+ "learning_rate": 2.6785714285714288e-05,
1639
+ "loss": 0.1455,
1640
+ "step": 2330
1641
+ },
1642
+ {
1643
+ "epoch": 0.7488,
1644
+ "grad_norm": 2.489349365234375,
1645
+ "learning_rate": 2.6448787061994614e-05,
1646
+ "loss": 0.1359,
1647
+ "step": 2340
1648
+ },
1649
+ {
1650
+ "epoch": 0.752,
1651
+ "grad_norm": 2.1165363788604736,
1652
+ "learning_rate": 2.6111859838274933e-05,
1653
+ "loss": 0.1448,
1654
+ "step": 2350
1655
+ },
1656
+ {
1657
+ "epoch": 0.7552,
1658
+ "grad_norm": 1.813704013824463,
1659
+ "learning_rate": 2.577493261455526e-05,
1660
+ "loss": 0.1341,
1661
+ "step": 2360
1662
+ },
1663
+ {
1664
+ "epoch": 0.7584,
1665
+ "grad_norm": 2.7160651683807373,
1666
+ "learning_rate": 2.5438005390835577e-05,
1667
+ "loss": 0.1255,
1668
+ "step": 2370
1669
+ },
1670
+ {
1671
+ "epoch": 0.7616,
1672
+ "grad_norm": 1.3045297861099243,
1673
+ "learning_rate": 2.5101078167115903e-05,
1674
+ "loss": 0.1334,
1675
+ "step": 2380
1676
+ },
1677
+ {
1678
+ "epoch": 0.7648,
1679
+ "grad_norm": 2.3194830417633057,
1680
+ "learning_rate": 2.476415094339623e-05,
1681
+ "loss": 0.1538,
1682
+ "step": 2390
1683
+ },
1684
+ {
1685
+ "epoch": 0.768,
1686
+ "grad_norm": 1.9577467441558838,
1687
+ "learning_rate": 2.442722371967655e-05,
1688
+ "loss": 0.1407,
1689
+ "step": 2400
1690
+ },
1691
+ {
1692
+ "epoch": 0.7712,
1693
+ "grad_norm": 1.4686191082000732,
1694
+ "learning_rate": 2.4090296495956874e-05,
1695
+ "loss": 0.1273,
1696
+ "step": 2410
1697
+ },
1698
+ {
1699
+ "epoch": 0.7744,
1700
+ "grad_norm": 2.1956889629364014,
1701
+ "learning_rate": 2.3753369272237196e-05,
1702
+ "loss": 0.1386,
1703
+ "step": 2420
1704
+ },
1705
+ {
1706
+ "epoch": 0.7776,
1707
+ "grad_norm": 1.6341290473937988,
1708
+ "learning_rate": 2.341644204851752e-05,
1709
+ "loss": 0.1298,
1710
+ "step": 2430
1711
+ },
1712
+ {
1713
+ "epoch": 0.7808,
1714
+ "grad_norm": 2.429835081100464,
1715
+ "learning_rate": 2.3079514824797844e-05,
1716
+ "loss": 0.1283,
1717
+ "step": 2440
1718
+ },
1719
+ {
1720
+ "epoch": 0.784,
1721
+ "grad_norm": 9.999998092651367,
1722
+ "learning_rate": 2.274258760107817e-05,
1723
+ "loss": 0.2458,
1724
+ "step": 2450
1725
+ },
1726
+ {
1727
+ "epoch": 0.7872,
1728
+ "grad_norm": 1.7494025230407715,
1729
+ "learning_rate": 2.2405660377358493e-05,
1730
+ "loss": 0.1338,
1731
+ "step": 2460
1732
+ },
1733
+ {
1734
+ "epoch": 0.7904,
1735
+ "grad_norm": 2.2596070766448975,
1736
+ "learning_rate": 2.2068733153638815e-05,
1737
+ "loss": 0.1208,
1738
+ "step": 2470
1739
+ },
1740
+ {
1741
+ "epoch": 0.7936,
1742
+ "grad_norm": 2.0361554622650146,
1743
+ "learning_rate": 2.1731805929919137e-05,
1744
+ "loss": 0.1403,
1745
+ "step": 2480
1746
+ },
1747
+ {
1748
+ "epoch": 0.7968,
1749
+ "grad_norm": 2.3333733081817627,
1750
+ "learning_rate": 2.1394878706199463e-05,
1751
+ "loss": 0.1222,
1752
+ "step": 2490
1753
+ },
1754
+ {
1755
+ "epoch": 0.8,
1756
+ "grad_norm": 1.4406805038452148,
1757
+ "learning_rate": 2.1057951482479785e-05,
1758
+ "loss": 0.1317,
1759
+ "step": 2500
1760
+ },
1761
+ {
1762
+ "epoch": 0.8032,
1763
+ "grad_norm": 1.7497354745864868,
1764
+ "learning_rate": 2.0721024258760108e-05,
1765
+ "loss": 0.129,
1766
+ "step": 2510
1767
+ },
1768
+ {
1769
+ "epoch": 0.8064,
1770
+ "grad_norm": 1.8802416324615479,
1771
+ "learning_rate": 2.038409703504043e-05,
1772
+ "loss": 0.1305,
1773
+ "step": 2520
1774
+ },
1775
+ {
1776
+ "epoch": 0.8096,
1777
+ "grad_norm": 1.733115792274475,
1778
+ "learning_rate": 2.0047169811320756e-05,
1779
+ "loss": 0.1266,
1780
+ "step": 2530
1781
+ },
1782
+ {
1783
+ "epoch": 0.8128,
1784
+ "grad_norm": 2.586423397064209,
1785
+ "learning_rate": 1.971024258760108e-05,
1786
+ "loss": 0.1336,
1787
+ "step": 2540
1788
+ },
1789
+ {
1790
+ "epoch": 0.816,
1791
+ "grad_norm": 1.6050171852111816,
1792
+ "learning_rate": 1.9373315363881404e-05,
1793
+ "loss": 0.134,
1794
+ "step": 2550
1795
+ },
1796
+ {
1797
+ "epoch": 0.8192,
1798
+ "grad_norm": 1.717938780784607,
1799
+ "learning_rate": 1.9036388140161727e-05,
1800
+ "loss": 0.1252,
1801
+ "step": 2560
1802
+ },
1803
+ {
1804
+ "epoch": 0.8224,
1805
+ "grad_norm": 1.871017575263977,
1806
+ "learning_rate": 1.869946091644205e-05,
1807
+ "loss": 0.1217,
1808
+ "step": 2570
1809
+ },
1810
+ {
1811
+ "epoch": 0.8256,
1812
+ "grad_norm": 2.6902308464050293,
1813
+ "learning_rate": 1.836253369272237e-05,
1814
+ "loss": 0.1235,
1815
+ "step": 2580
1816
+ },
1817
+ {
1818
+ "epoch": 0.8288,
1819
+ "grad_norm": 1.9302878379821777,
1820
+ "learning_rate": 1.8025606469002694e-05,
1821
+ "loss": 0.1303,
1822
+ "step": 2590
1823
+ },
1824
+ {
1825
+ "epoch": 0.832,
1826
+ "grad_norm": 2.0468838214874268,
1827
+ "learning_rate": 1.768867924528302e-05,
1828
+ "loss": 0.1176,
1829
+ "step": 2600
1830
+ },
1831
+ {
1832
+ "epoch": 0.8352,
1833
+ "grad_norm": 1.396986722946167,
1834
+ "learning_rate": 1.7351752021563345e-05,
1835
+ "loss": 0.1163,
1836
+ "step": 2610
1837
+ },
1838
+ {
1839
+ "epoch": 0.8384,
1840
+ "grad_norm": 1.742895245552063,
1841
+ "learning_rate": 1.7014824797843668e-05,
1842
+ "loss": 0.118,
1843
+ "step": 2620
1844
+ },
1845
+ {
1846
+ "epoch": 0.8416,
1847
+ "grad_norm": 1.870067834854126,
1848
+ "learning_rate": 1.667789757412399e-05,
1849
+ "loss": 0.1229,
1850
+ "step": 2630
1851
+ },
1852
+ {
1853
+ "epoch": 0.8448,
1854
+ "grad_norm": 1.7094252109527588,
1855
+ "learning_rate": 1.6340970350404313e-05,
1856
+ "loss": 0.1384,
1857
+ "step": 2640
1858
+ },
1859
+ {
1860
+ "epoch": 0.848,
1861
+ "grad_norm": 2.3981940746307373,
1862
+ "learning_rate": 1.600404312668464e-05,
1863
+ "loss": 0.1107,
1864
+ "step": 2650
1865
+ },
1866
+ {
1867
+ "epoch": 0.8512,
1868
+ "grad_norm": 2.565001964569092,
1869
+ "learning_rate": 1.566711590296496e-05,
1870
+ "loss": 0.1152,
1871
+ "step": 2660
1872
+ },
1873
+ {
1874
+ "epoch": 0.8544,
1875
+ "grad_norm": 2.4875590801239014,
1876
+ "learning_rate": 1.5330188679245283e-05,
1877
+ "loss": 0.1227,
1878
+ "step": 2670
1879
+ },
1880
+ {
1881
+ "epoch": 0.8576,
1882
+ "grad_norm": 2.2726962566375732,
1883
+ "learning_rate": 1.4993261455525606e-05,
1884
+ "loss": 0.1172,
1885
+ "step": 2680
1886
+ },
1887
+ {
1888
+ "epoch": 0.8608,
1889
+ "grad_norm": 1.7912529706954956,
1890
+ "learning_rate": 1.465633423180593e-05,
1891
+ "loss": 0.1124,
1892
+ "step": 2690
1893
+ },
1894
+ {
1895
+ "epoch": 0.864,
1896
+ "grad_norm": 1.7430084943771362,
1897
+ "learning_rate": 1.4319407008086256e-05,
1898
+ "loss": 0.1215,
1899
+ "step": 2700
1900
+ },
1901
+ {
1902
+ "epoch": 0.8672,
1903
+ "grad_norm": 2.2106008529663086,
1904
+ "learning_rate": 1.3982479784366578e-05,
1905
+ "loss": 0.1185,
1906
+ "step": 2710
1907
+ },
1908
+ {
1909
+ "epoch": 0.8704,
1910
+ "grad_norm": 2.9677484035491943,
1911
+ "learning_rate": 1.3645552560646902e-05,
1912
+ "loss": 0.1241,
1913
+ "step": 2720
1914
+ },
1915
+ {
1916
+ "epoch": 0.8736,
1917
+ "grad_norm": 1.9449735879898071,
1918
+ "learning_rate": 1.3308625336927224e-05,
1919
+ "loss": 0.1173,
1920
+ "step": 2730
1921
+ },
1922
+ {
1923
+ "epoch": 0.8768,
1924
+ "grad_norm": 2.0924923419952393,
1925
+ "learning_rate": 1.2971698113207547e-05,
1926
+ "loss": 0.1253,
1927
+ "step": 2740
1928
+ },
1929
+ {
1930
+ "epoch": 0.88,
1931
+ "grad_norm": 2.4807486534118652,
1932
+ "learning_rate": 1.2634770889487871e-05,
1933
+ "loss": 0.109,
1934
+ "step": 2750
1935
+ },
1936
+ {
1937
+ "epoch": 0.8832,
1938
+ "grad_norm": 2.2792012691497803,
1939
+ "learning_rate": 1.2297843665768195e-05,
1940
+ "loss": 0.1183,
1941
+ "step": 2760
1942
+ },
1943
+ {
1944
+ "epoch": 0.8864,
1945
+ "grad_norm": 1.7402423620224,
1946
+ "learning_rate": 1.1960916442048519e-05,
1947
+ "loss": 0.1028,
1948
+ "step": 2770
1949
+ },
1950
+ {
1951
+ "epoch": 0.8896,
1952
+ "grad_norm": 1.4183549880981445,
1953
+ "learning_rate": 1.1623989218328842e-05,
1954
+ "loss": 0.109,
1955
+ "step": 2780
1956
+ },
1957
+ {
1958
+ "epoch": 0.8928,
1959
+ "grad_norm": 2.241910934448242,
1960
+ "learning_rate": 1.1287061994609164e-05,
1961
+ "loss": 0.1084,
1962
+ "step": 2790
1963
+ },
1964
+ {
1965
+ "epoch": 0.896,
1966
+ "grad_norm": 1.7856048345565796,
1967
+ "learning_rate": 1.0950134770889488e-05,
1968
+ "loss": 0.1181,
1969
+ "step": 2800
1970
+ },
1971
+ {
1972
+ "epoch": 0.8992,
1973
+ "grad_norm": 1.8856292963027954,
1974
+ "learning_rate": 1.0613207547169812e-05,
1975
+ "loss": 0.1205,
1976
+ "step": 2810
1977
+ },
1978
+ {
1979
+ "epoch": 0.9024,
1980
+ "grad_norm": 2.5832862854003906,
1981
+ "learning_rate": 1.0276280323450135e-05,
1982
+ "loss": 0.1255,
1983
+ "step": 2820
1984
+ },
1985
+ {
1986
+ "epoch": 0.9056,
1987
+ "grad_norm": 1.7440129518508911,
1988
+ "learning_rate": 9.939353099730459e-06,
1989
+ "loss": 0.1186,
1990
+ "step": 2830
1991
+ },
1992
+ {
1993
+ "epoch": 0.9088,
1994
+ "grad_norm": 1.7217051982879639,
1995
+ "learning_rate": 9.602425876010781e-06,
1996
+ "loss": 0.1159,
1997
+ "step": 2840
1998
+ },
1999
+ {
2000
+ "epoch": 0.912,
2001
+ "grad_norm": 2.0266754627227783,
2002
+ "learning_rate": 9.265498652291107e-06,
2003
+ "loss": 0.1112,
2004
+ "step": 2850
2005
+ },
2006
+ {
2007
+ "epoch": 0.9152,
2008
+ "grad_norm": 1.816345453262329,
2009
+ "learning_rate": 8.92857142857143e-06,
2010
+ "loss": 0.1225,
2011
+ "step": 2860
2012
+ },
2013
+ {
2014
+ "epoch": 0.9184,
2015
+ "grad_norm": 1.9036134481430054,
2016
+ "learning_rate": 8.591644204851752e-06,
2017
+ "loss": 0.1215,
2018
+ "step": 2870
2019
+ },
2020
+ {
2021
+ "epoch": 0.9216,
2022
+ "grad_norm": 1.8858239650726318,
2023
+ "learning_rate": 8.254716981132076e-06,
2024
+ "loss": 0.1015,
2025
+ "step": 2880
2026
+ },
2027
+ {
2028
+ "epoch": 0.9248,
2029
+ "grad_norm": 1.9040420055389404,
2030
+ "learning_rate": 7.9177897574124e-06,
2031
+ "loss": 0.1115,
2032
+ "step": 2890
2033
+ },
2034
+ {
2035
+ "epoch": 0.928,
2036
+ "grad_norm": 2.292043447494507,
2037
+ "learning_rate": 7.580862533692723e-06,
2038
+ "loss": 0.1208,
2039
+ "step": 2900
2040
+ },
2041
+ {
2042
+ "epoch": 0.9312,
2043
+ "grad_norm": 2.022998809814453,
2044
+ "learning_rate": 7.243935309973046e-06,
2045
+ "loss": 0.1041,
2046
+ "step": 2910
2047
+ },
2048
+ {
2049
+ "epoch": 0.9344,
2050
+ "grad_norm": 2.045623779296875,
2051
+ "learning_rate": 6.90700808625337e-06,
2052
+ "loss": 0.1115,
2053
+ "step": 2920
2054
+ },
2055
+ {
2056
+ "epoch": 0.9376,
2057
+ "grad_norm": 1.6464377641677856,
2058
+ "learning_rate": 6.570080862533692e-06,
2059
+ "loss": 0.112,
2060
+ "step": 2930
2061
+ },
2062
+ {
2063
+ "epoch": 0.9408,
2064
+ "grad_norm": 1.6890789270401,
2065
+ "learning_rate": 6.233153638814016e-06,
2066
+ "loss": 0.1081,
2067
+ "step": 2940
2068
+ },
2069
+ {
2070
+ "epoch": 0.944,
2071
+ "grad_norm": 2.4978396892547607,
2072
+ "learning_rate": 5.89622641509434e-06,
2073
+ "loss": 0.1147,
2074
+ "step": 2950
2075
+ },
2076
+ {
2077
+ "epoch": 0.9472,
2078
+ "grad_norm": 1.9748504161834717,
2079
+ "learning_rate": 5.5592991913746634e-06,
2080
+ "loss": 0.1019,
2081
+ "step": 2960
2082
+ },
2083
+ {
2084
+ "epoch": 0.9504,
2085
+ "grad_norm": 2.1371240615844727,
2086
+ "learning_rate": 5.222371967654987e-06,
2087
+ "loss": 0.1114,
2088
+ "step": 2970
2089
+ },
2090
+ {
2091
+ "epoch": 0.9536,
2092
+ "grad_norm": 2.8856780529022217,
2093
+ "learning_rate": 4.88544474393531e-06,
2094
+ "loss": 0.1142,
2095
+ "step": 2980
2096
+ },
2097
+ {
2098
+ "epoch": 0.9568,
2099
+ "grad_norm": 1.8940197229385376,
2100
+ "learning_rate": 4.548517520215634e-06,
2101
+ "loss": 0.1004,
2102
+ "step": 2990
2103
+ },
2104
+ {
2105
+ "epoch": 0.96,
2106
+ "grad_norm": 1.2457224130630493,
2107
+ "learning_rate": 4.211590296495957e-06,
2108
+ "loss": 0.1063,
2109
+ "step": 3000
2110
+ },
2111
+ {
2112
+ "epoch": 0.9632,
2113
+ "grad_norm": 3.265352725982666,
2114
+ "learning_rate": 3.8746630727762805e-06,
2115
+ "loss": 0.1196,
2116
+ "step": 3010
2117
+ },
2118
+ {
2119
+ "epoch": 0.9664,
2120
+ "grad_norm": 2.0506467819213867,
2121
+ "learning_rate": 3.5377358490566038e-06,
2122
+ "loss": 0.1153,
2123
+ "step": 3020
2124
+ },
2125
+ {
2126
+ "epoch": 0.9696,
2127
+ "grad_norm": 1.4757201671600342,
2128
+ "learning_rate": 3.200808625336928e-06,
2129
+ "loss": 0.1068,
2130
+ "step": 3030
2131
+ },
2132
+ {
2133
+ "epoch": 0.9728,
2134
+ "grad_norm": 2.700068473815918,
2135
+ "learning_rate": 2.8638814016172507e-06,
2136
+ "loss": 0.1119,
2137
+ "step": 3040
2138
+ },
2139
+ {
2140
+ "epoch": 0.976,
2141
+ "grad_norm": 2.0698773860931396,
2142
+ "learning_rate": 2.5269541778975744e-06,
2143
+ "loss": 0.1051,
2144
+ "step": 3050
2145
+ },
2146
+ {
2147
+ "epoch": 0.9792,
2148
+ "grad_norm": 1.6540876626968384,
2149
+ "learning_rate": 2.1900269541778976e-06,
2150
+ "loss": 0.1101,
2151
+ "step": 3060
2152
+ },
2153
+ {
2154
+ "epoch": 0.9824,
2155
+ "grad_norm": 1.8899681568145752,
2156
+ "learning_rate": 1.853099730458221e-06,
2157
+ "loss": 0.1118,
2158
+ "step": 3070
2159
+ },
2160
+ {
2161
+ "epoch": 0.9856,
2162
+ "grad_norm": 1.6223585605621338,
2163
+ "learning_rate": 1.5161725067385445e-06,
2164
+ "loss": 0.1087,
2165
+ "step": 3080
2166
+ },
2167
+ {
2168
+ "epoch": 0.9888,
2169
+ "grad_norm": 1.9171777963638306,
2170
+ "learning_rate": 1.179245283018868e-06,
2171
+ "loss": 0.1058,
2172
+ "step": 3090
2173
+ },
2174
+ {
2175
+ "epoch": 0.992,
2176
+ "grad_norm": 1.4021532535552979,
2177
+ "learning_rate": 8.423180592991913e-07,
2178
+ "loss": 0.1044,
2179
+ "step": 3100
2180
+ },
2181
+ {
2182
+ "epoch": 0.9952,
2183
+ "grad_norm": 2.6420481204986572,
2184
+ "learning_rate": 5.053908355795148e-07,
2185
+ "loss": 0.1223,
2186
+ "step": 3110
2187
+ },
2188
+ {
2189
+ "epoch": 0.9984,
2190
+ "grad_norm": 1.7279895544052124,
2191
+ "learning_rate": 1.684636118598383e-07,
2192
+ "loss": 0.1092,
2193
+ "step": 3120
2194
+ }
2195
+ ],
2196
+ "logging_steps": 10,
2197
+ "max_steps": 3125,
2198
+ "num_input_tokens_seen": 0,
2199
+ "num_train_epochs": 1,
2200
+ "save_steps": 500,
2201
+ "stateful_callbacks": {
2202
+ "TrainerControl": {
2203
+ "args": {
2204
+ "should_epoch_stop": false,
2205
+ "should_evaluate": false,
2206
+ "should_log": false,
2207
+ "should_save": true,
2208
+ "should_training_stop": true
2209
+ },
2210
+ "attributes": {}
2211
+ }
2212
+ },
2213
+ "total_flos": 0.0,
2214
+ "train_batch_size": 32,
2215
+ "trial_name": null,
2216
+ "trial_params": null
2217
+ }
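The file above follows the standard `transformers` `TrainerState` schema (what appears to be the checkpoint's `trainer_state.json`): one `log_history` entry every 10 steps (`"logging_steps": 10`) over a single epoch of 3,125 steps at train batch size 32. A minimal sketch for inspecting it, assuming the checkpoint has been downloaded to a local `checkpoint-3125/` directory:

```python
import json

# Hypothetical local path; point this at wherever the checkpoint lives.
with open("checkpoint-3125/trainer_state.json") as f:
    state = json.load(f)

print(state["max_steps"], state["train_batch_size"])

# Each log_history entry records step, epoch, loss, grad_norm, learning_rate.
final = state["log_history"][-1]
print(final["step"], final.get("loss"), final.get("grad_norm"))
```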
checkpoint-3125/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:34b25fd9c89f4f3be8a73938f7135204ea978324b0a810d63ac2bad185610a04
+ size 5304
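`training_args.bin` is committed as a Git LFS pointer (a 5.3 kB object); the underlying file is a pickled `transformers.TrainingArguments`. A hedged sketch for recovering the run configuration, assuming a recent PyTorch where `torch.load` defaults to `weights_only=True`:

```python
import torch

# TrainingArguments is a pickled dataclass, not a tensor file, so
# weights_only=False is required on recent PyTorch. Only do this for
# checkpoints you trust: unpickling can execute arbitrary code.
args = torch.load("checkpoint-3125/training_args.bin", weights_only=False)
print(args.learning_rate, args.per_device_train_batch_size, args.max_grad_norm)
```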
checkpoint-3125/vocab.json ADDED
The diff for this file is too large to render. See raw diff
config.json ADDED
@@ -0,0 +1,40 @@
+ {
+ "_name_or_path": "gpt2-medium",
+ "activation_function": "gelu_new",
+ "architectures": [
+ "GPT2LMHeadModel"
+ ],
+ "attn_pdrop": 0.1,
+ "bos_token_id": 50256,
+ "embd_pdrop": 0.1,
+ "eos_token_id": 50256,
+ "initializer_range": 0.02,
+ "layer_norm_epsilon": 1e-05,
+ "model_type": "gpt2",
+ "n_ctx": 1024,
+ "n_embd": 1024,
+ "n_head": 16,
+ "n_inner": null,
+ "n_layer": 24,
+ "n_positions": 1024,
+ "n_special": 0,
+ "predict_special_tokens": true,
+ "reorder_and_upcast_attn": false,
+ "resid_pdrop": 0.1,
+ "scale_attn_by_inverse_layer_idx": false,
+ "scale_attn_weights": true,
+ "summary_activation": null,
+ "summary_first_dropout": 0.1,
+ "summary_proj_to_labels": true,
+ "summary_type": "cls_index",
+ "summary_use_proj": true,
+ "task_specific_params": {
+ "text-generation": {
+ "do_sample": true,
+ "max_length": 50
+ }
+ },
+ "transformers_version": "4.45.2",
+ "use_cache": true,
+ "vocab_size": 50257
+ }
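This `config.json` pins the base model: `gpt2-medium` (24 layers, 16 heads, 1,024-dim embeddings, 1,024-token context), saved with `transformers` 4.45.2. If this repository is used as a PEFT adapter on top of that base, a minimal loading sketch could look like the following, with `"<user>/<repo>"` as a placeholder for the actual Hub repo id:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the frozen base model named in config.json, then attach the adapter.
base = AutoModelForCausalLM.from_pretrained("gpt2-medium")
model = PeftModel.from_pretrained(base, "<user>/<repo>")  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained("gpt2-medium")
```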
merges.txt ADDED
The diff for this file is too large to render. See raw diff
special_tokens_map.json ADDED
@@ -0,0 +1,24 @@
+ {
+ "bos_token": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": "<|endoftext|>",
+ "unk_token": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
tokenizer_config.json ADDED
@@ -0,0 +1,22 @@
+ {
+ "add_bos_token": false,
+ "add_prefix_space": false,
+ "added_tokens_decoder": {
+ "50256": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<|endoftext|>",
+ "clean_up_tokenization_spaces": false,
+ "eos_token": "<|endoftext|>",
+ "errors": "replace",
+ "model_max_length": 1024,
+ "pad_token": "<|endoftext|>",
+ "tokenizer_class": "GPT2Tokenizer",
+ "unk_token": "<|endoftext|>"
+ }
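Both tokenizer files map `bos`, `eos`, `unk`, and `pad` to the single `<|endoftext|>` token: GPT-2 ships no dedicated padding token, so EOS is reused for padding, a common pattern for causal LMs. A small sketch, assuming the repository has been cloned to the current directory:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(".")  # "." = local clone of this repo
assert tokenizer.pad_token == tokenizer.eos_token == "<|endoftext|>"

# With pad == eos, rely on the attention mask so padding is not attended to.
batch = tokenizer(["short", "a longer example"], padding=True, return_tensors="pt")
print(batch["attention_mask"])
```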
training_logs.csv ADDED
@@ -0,0 +1,314 @@
+ step,training_loss,grad_norm
+ 10,24.9631,10.0
+ 20,20.8416,10.0
+ 30,18.8244,9.999999046325684
+ 40,20.6664,10.0
+ 50,17.5711,9.999999046325684
+ 60,15.9775,10.0
+ 70,15.0004,10.0
+ 80,13.074,9.999999046325684
+ 90,10.9575,9.999998092651367
+ 100,8.7852,9.999998092651367
+ 110,8.9422,9.999998092651367
+ 120,6.8078,10.0
+ 130,6.0916,10.0
+ 140,5.8139,9.999999046325684
+ 150,5.7631,9.999999046325684
+ 160,4.6852,9.999998092651367
+ 170,4.6054,9.999999046325684
+ 180,4.0257,9.999999046325684
+ 190,4.1777,10.0
+ 200,4.0922,9.999999046325684
+ 210,3.5944,9.999998092651367
+ 220,3.5933,9.999999046325684
+ 230,3.684,10.0
+ 240,3.1428,9.999999046325684
+ 250,2.9727,9.999999046325684
+ 260,3.2243,9.999999046325684
+ 270,2.9269,9.463090896606445
+ 280,2.955,9.999999046325684
+ 290,2.5544,9.999999046325684
+ 300,2.7417,10.0
+ 310,2.569,9.999998092651367
+ 320,2.3499,9.225275039672852
+ 330,2.6455,9.999998092651367
+ 340,2.3627,9.915996551513672
+ 350,2.3267,9.526017189025879
+ 360,2.1942,9.999999046325684
+ 370,2.2241,10.0
+ 380,2.1867,9.999998092651367
+ 390,2.1133,6.984477996826172
+ 400,2.1515,9.999999046325684
+ 410,2.191,9.388639450073242
+ 420,2.05,6.955385684967041
+ 430,1.9369,8.776995658874512
+ 440,1.8571,9.813057899475098
+ 450,1.8026,9.999998092651367
+ 460,1.6484,9.999999046325684
+ 470,1.601,6.5526018142700195
+ 480,1.5064,9.999999046325684
+ 490,1.6099,7.3129496574401855
+ 500,1.5446,8.732329368591309
+ 510,1.3392,6.793776512145996
+ 520,1.3408,8.43354606628418
+ 530,1.3075,7.959011077880859
+ 540,1.4838,10.0
+ 550,1.3092,8.968912124633789
+ 560,1.3005,10.0
+ 570,1.1666,9.999999046325684
+ 580,1.292,9.999999046325684
+ 590,1.2324,9.630219459533691
+ 600,1.2701,9.999999046325684
+ 610,1.1545,10.0
+ 620,1.0769,8.581560134887695
+ 630,1.1256,9.464119911193848
+ 640,1.0227,6.58554744720459
+ 650,1.077,6.532876491546631
+ 660,1.0998,7.9927263259887695
+ 670,0.9585,5.303004264831543
+ 680,0.8816,6.559380531311035
+ 690,1.1047,7.447219371795654
+ 700,0.8914,8.270160675048828
+ 710,0.9932,9.999999046325684
+ 720,0.9273,8.01986312866211
+ 730,1.0231,5.2210917472839355
+ 740,0.8625,5.701915740966797
+ 750,0.9228,8.181998252868652
+ 760,0.8675,6.676185131072998
+ 770,0.9035,6.564544677734375
+ 780,0.8019,9.999998092651367
+ 790,0.8066,9.999999046325684
+ 800,0.8352,10.0
+ 810,0.8574,7.660287857055664
+ 820,0.8104,4.782571792602539
+ 830,0.8074,7.636229991912842
+ 840,0.7614,5.22883939743042
+ 850,0.7084,6.009452819824219
+ 860,0.7718,7.293655872344971
+ 870,0.7292,5.065558433532715
+ 880,0.7474,7.277278423309326
+ 890,0.7241,9.6370210647583
+ 900,0.6728,7.019769191741943
+ 910,0.7054,5.834097862243652
+ 920,0.7325,9.999998092651367
+ 930,0.7168,8.013876914978027
+ 940,0.5667,5.732053756713867
+ 950,0.6504,4.955954551696777
+ 960,0.5968,4.867282390594482
+ 970,0.6032,6.085322856903076
+ 980,0.6642,6.458901882171631
+ 990,0.6572,8.118791580200195
+ 1000,0.6236,6.958062171936035
+ 1010,0.5852,4.629474639892578
+ 1020,0.6545,5.775842189788818
+ 1030,0.5677,5.303742408752441
+ 1040,0.5463,6.655779838562012
+ 1050,0.5669,6.062790870666504
+ 1060,0.5475,4.374266147613525
+ 1070,0.5624,4.560005187988281
+ 1080,0.5455,4.675742149353027
+ 1090,0.5485,5.191764831542969
+ 1100,0.5439,6.050549030303955
+ 1110,0.4899,7.021573543548584
+ 1120,0.499,5.938823699951172
+ 1130,0.4657,6.675471305847168
+ 1140,0.4819,4.019260883331299
+ 1150,0.4471,6.338311195373535
+ 1160,0.4334,5.028086185455322
+ 1170,0.4569,3.6760733127593994
+ 1180,0.4314,5.685121059417725
+ 1190,0.4527,4.894414901733398
+ 1200,0.3814,6.624325275421143
+ 1210,0.4354,7.109598636627197
+ 1220,0.4103,5.322418689727783
+ 1230,0.414,4.165887832641602
+ 1240,0.4091,2.6662333011627197
+ 1250,0.434,4.60746431350708
+ 1260,0.3858,4.521622180938721
+ 1270,0.3935,3.3457305431365967
+ 1280,0.4036,3.885206699371338
+ 1290,0.3972,3.958425998687744
+ 1300,0.359,4.6747846603393555
+ 1310,0.3843,3.6438868045806885
+ 1320,0.3952,4.175204753875732
+ 1330,0.363,2.771331548690796
+ 1340,0.3401,3.504378080368042
+ 1350,0.335,4.495083808898926
+ 1360,0.3138,3.8098154067993164
+ 1370,0.3234,4.181695938110352
+ 1380,0.2771,3.0462260246276855
+ 1390,0.3024,3.0168416500091553
+ 1400,0.307,4.196632385253906
+ 1410,0.3261,3.4224884510040283
+ 1420,0.3116,3.340468645095825
+ 1430,0.3037,3.3401427268981934
+ 1440,0.2975,3.3940038681030273
+ 1450,0.288,2.9820821285247803
+ 1460,0.2998,3.953834056854248
+ 1470,0.3029,3.664151906967163
+ 1480,0.308,5.874343395233154
+ 1490,0.293,5.167667388916016
+ 1500,0.2736,2.0990707874298096
+ 1510,0.2802,3.07232403755188
+ 1520,0.2773,3.340930461883545
+ 1530,0.2645,2.88307523727417
+ 1540,0.2864,3.0353357791900635
+ 1550,0.2674,2.4584333896636963
+ 1560,0.2425,2.793869972229004
+ 1570,0.245,2.5717570781707764
+ 1580,0.2553,2.962009906768799
+ 1590,0.2338,2.053489923477173
+ 1600,0.2349,3.5270988941192627
+ 1610,0.2475,3.227064847946167
+ 1620,0.2264,3.002271890640259
+ 1630,0.2473,2.5545215606689453
+ 1640,0.228,3.1033124923706055
+ 1650,0.2337,2.90221905708313
+ 1660,0.2322,3.5613458156585693
+ 1670,0.2153,2.788691282272339
+ 1680,0.2249,2.7116010189056396
+ 1690,0.2265,2.3744070529937744
+ 1700,0.2132,2.920132637023926
+ 1710,0.2061,2.0210254192352295
+ 1720,0.1875,2.7059290409088135
+ 1730,0.2012,2.262721061706543
+ 1740,0.2325,3.484440326690674
+ 1750,0.2051,2.090310573577881
+ 1760,0.1883,2.2044506072998047
+ 1770,0.2145,2.702648878097534
+ 1780,0.2058,9.999999046325684
+ 1790,0.2181,5.150343894958496
+ 1800,0.1986,1.4523239135742188
+ 1810,0.1908,2.1557908058166504
+ 1820,0.185,2.1338348388671875
+ 1830,0.1899,2.266662836074829
+ 1840,0.1799,1.9031009674072266
+ 1850,0.1965,1.731602668762207
+ 1860,0.1998,1.3521720170974731
+ 1870,0.1873,2.028102397918701
+ 1880,0.1821,3.0926480293273926
+ 1890,0.1737,2.1551952362060547
+ 1900,0.1898,2.661207437515259
+ 1910,0.1902,2.934997081756592
+ 1920,0.1744,1.7465407848358154
+ 1930,0.1749,2.341937780380249
+ 1940,0.164,2.418426513671875
+ 1950,0.1682,2.408447265625
+ 1960,0.1676,1.976258397102356
+ 1970,0.1572,2.601335287094116
+ 1980,0.1545,2.3064210414886475
+ 1990,0.1657,2.60603666305542
+ 2000,0.165,2.5705275535583496
+ 2010,0.1705,3.4823734760284424
+ 2020,0.1502,1.5383442640304565
+ 2030,0.1598,2.4880733489990234
+ 2040,0.1582,2.6558523178100586
+ 2050,0.1596,2.7092580795288086
+ 2060,0.1505,2.1501896381378174
+ 2070,0.1498,1.9364612102508545
+ 2080,0.1561,1.987347960472107
+ 2090,0.153,2.656722068786621
+ 2100,0.162,2.4392051696777344
+ 2110,0.1627,2.220731019973755
+ 2120,0.1652,1.5795049667358398
+ 2130,0.1606,2.1749913692474365
+ 2140,0.1461,2.301445960998535
+ 2150,0.1498,2.8436062335968018
+ 2160,0.1543,1.9361361265182495
+ 2170,0.1402,1.9528151750564575
+ 2180,0.1329,2.5773980617523193
+ 2190,0.1574,2.697664976119995
+ 2200,0.1513,2.53369402885437
+ 2210,0.1463,2.3387765884399414
+ 2220,0.1373,1.6529033184051514
+ 2230,0.1325,2.692295551300049
+ 2240,0.1484,2.09934401512146
+ 2250,0.1341,2.7355453968048096
+ 2260,0.1475,1.8107614517211914
+ 2270,0.1438,1.958337426185608
+ 2280,0.1395,1.456838607788086
+ 2290,0.1329,1.997288703918457
+ 2300,0.1357,2.1655972003936768
+ 2310,0.1444,1.8002907037734985
+ 2320,0.1455,2.862435817718506
+ 2330,0.1455,1.9556299448013306
+ 2340,0.1359,2.489349365234375
+ 2350,0.1448,2.1165363788604736
+ 2360,0.1341,1.813704013824463
+ 2370,0.1255,2.7160651683807373
+ 2380,0.1334,1.3045297861099243
+ 2390,0.1538,2.3194830417633057
+ 2400,0.1407,1.9577467441558838
+ 2410,0.1273,1.4686191082000732
+ 2420,0.1386,2.1956889629364014
+ 2430,0.1298,1.6341290473937988
+ 2440,0.1283,2.429835081100464
+ 2450,0.2458,9.999998092651367
+ 2460,0.1338,1.7494025230407715
+ 2470,0.1208,2.2596070766448975
+ 2480,0.1403,2.0361554622650146
+ 2490,0.1222,2.3333733081817627
+ 2500,0.1317,1.4406805038452148
+ 2510,0.129,1.7497354745864868
+ 2520,0.1305,1.8802416324615479
+ 2530,0.1266,1.733115792274475
+ 2540,0.1336,2.586423397064209
+ 2550,0.134,1.6050171852111816
+ 2560,0.1252,1.717938780784607
+ 2570,0.1217,1.871017575263977
+ 2580,0.1235,2.6902308464050293
+ 2590,0.1303,1.9302878379821777
+ 2600,0.1176,2.0468838214874268
+ 2610,0.1163,1.396986722946167
+ 2620,0.118,1.742895245552063
+ 2630,0.1229,1.870067834854126
+ 2640,0.1384,1.7094252109527588
+ 2650,0.1107,2.3981940746307373
+ 2660,0.1152,2.565001964569092
+ 2670,0.1227,2.4875590801239014
+ 2680,0.1172,2.2726962566375732
+ 2690,0.1124,1.7912529706954956
+ 2700,0.1215,1.7430084943771362
+ 2710,0.1185,2.2106008529663086
+ 2720,0.1241,2.9677484035491943
+ 2730,0.1173,1.9449735879898071
+ 2740,0.1253,2.0924923419952393
+ 2750,0.109,2.4807486534118652
+ 2760,0.1183,2.2792012691497803
+ 2770,0.1028,1.7402423620224
+ 2780,0.109,1.4183549880981445
+ 2790,0.1084,2.241910934448242
+ 2800,0.1181,1.7856048345565796
+ 2810,0.1205,1.8856292963027954
+ 2820,0.1255,2.5832862854003906
+ 2830,0.1186,1.7440129518508911
+ 2840,0.1159,1.7217051982879639
+ 2850,0.1112,2.0266754627227783
+ 2860,0.1225,1.816345453262329
+ 2870,0.1215,1.9036134481430054
+ 2880,0.1015,1.8858239650726318
+ 2890,0.1115,1.9040420055389404
+ 2900,0.1208,2.292043447494507
+ 2910,0.1041,2.022998809814453
+ 2920,0.1115,2.045623779296875
+ 2930,0.112,1.6464377641677856
+ 2940,0.1081,1.6890789270401
+ 2950,0.1147,2.4978396892547607
+ 2960,0.1019,1.9748504161834717
+ 2970,0.1114,2.1371240615844727
+ 2980,0.1142,2.8856780529022217
+ 2990,0.1004,1.8940197229385376
+ 3000,0.1063,1.2457224130630493
+ 3010,0.1196,3.265352725982666
+ 3020,0.1153,2.0506467819213867
+ 3030,0.1068,1.4757201671600342
+ 3040,0.1119,2.700068473815918
+ 3050,0.1051,2.0698773860931396
+ 3060,0.1101,1.6540876626968384
+ 3070,0.1118,1.8899681568145752
+ 3080,0.1087,1.6223585605621338
+ 3090,0.1058,1.9171777963638306
+ 3100,0.1044,1.4021532535552979
+ 3110,0.1223,2.6420481204986572
+ 3120,0.1092,1.7279895544052124
+ 3125,nan,nan
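`training_logs.csv` flattens the same per-10-step series (training loss and gradient norm) into one table for quick analysis. Note that the final row at step 3125 carries `nan` values, and that early in training the gradient norm sits at exactly 10.0, consistent with gradient clipping at `max_grad_norm=10`. A short pandas sketch, assuming a local copy of the file:

```python
import pandas as pd

logs = pd.read_csv("training_logs.csv").dropna()  # drop the nan row at step 3125

# Loss falls from ~25 at step 10 to ~0.11 by step 3120.
print(logs.iloc[[0, -1]])

# Early training pins the gradient norm at the apparent clipping ceiling of 10.
print((logs["grad_norm"] > 9.99).sum(), "logged steps at norm ~10.0")
```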
vocab.json ADDED
The diff for this file is too large to render. See raw diff