maanasharma5 committed (verified)
Commit b013d65 · 1 parent: 256696d

Upload folder using huggingface_hub

Files changed (45)
  1. README.md +202 -0
  2. adapter_config.json +31 -0
  3. adapter_model.safetensors +3 -0
  4. checkpoint-3125/README.md +202 -0
  5. checkpoint-3125/adapter_config.json +31 -0
  6. checkpoint-3125/adapter_model.safetensors +3 -0
  7. checkpoint-3125/merges.txt +0 -0
  8. checkpoint-3125/optimizer.pt +3 -0
  9. checkpoint-3125/rng_state.pth +3 -0
  10. checkpoint-3125/scheduler.pt +3 -0
  11. checkpoint-3125/special_tokens_map.json +24 -0
  12. checkpoint-3125/tokenizer_config.json +22 -0
  13. checkpoint-3125/trainer_state.json +0 -0
  14. checkpoint-3125/training_args.bin +3 -0
  15. checkpoint-3125/vocab.json +0 -0
  16. checkpoint-6250/README.md +202 -0
  17. checkpoint-6250/adapter_config.json +31 -0
  18. checkpoint-6250/adapter_model.safetensors +3 -0
  19. checkpoint-6250/merges.txt +0 -0
  20. checkpoint-6250/optimizer.pt +3 -0
  21. checkpoint-6250/rng_state.pth +3 -0
  22. checkpoint-6250/scheduler.pt +3 -0
  23. checkpoint-6250/special_tokens_map.json +24 -0
  24. checkpoint-6250/tokenizer_config.json +22 -0
  25. checkpoint-6250/trainer_state.json +0 -0
  26. checkpoint-6250/training_args.bin +3 -0
  27. checkpoint-6250/vocab.json +0 -0
  28. checkpoint-9375/README.md +202 -0
  29. checkpoint-9375/adapter_config.json +31 -0
  30. checkpoint-9375/adapter_model.safetensors +3 -0
  31. checkpoint-9375/merges.txt +0 -0
  32. checkpoint-9375/optimizer.pt +3 -0
  33. checkpoint-9375/rng_state.pth +3 -0
  34. checkpoint-9375/scheduler.pt +3 -0
  35. checkpoint-9375/special_tokens_map.json +24 -0
  36. checkpoint-9375/tokenizer_config.json +22 -0
  37. checkpoint-9375/trainer_state.json +0 -0
  38. checkpoint-9375/training_args.bin +3 -0
  39. checkpoint-9375/vocab.json +0 -0
  40. config.json +40 -0
  41. merges.txt +0 -0
  42. special_tokens_map.json +24 -0
  43. tokenizer_config.json +22 -0
  44. training_logs.csv +939 -0
  45. vocab.json +0 -0
README.md ADDED
@@ -0,0 +1,202 @@
1
+ ---
2
+ base_model: gpt2-medium
3
+ library_name: peft
4
+ ---
5
+
6
+ # Model Card for Model ID
7
+
8
+ <!-- Provide a quick summary of what the model is/does. -->
9
+
10
+
11
+
12
+ ## Model Details
13
+
14
+ ### Model Description
15
+
16
+ <!-- Provide a longer summary of what this model is. -->
17
+
18
+
19
+
20
+ - **Developed by:** [More Information Needed]
21
+ - **Funded by [optional]:** [More Information Needed]
22
+ - **Shared by [optional]:** [More Information Needed]
23
+ - **Model type:** [More Information Needed]
24
+ - **Language(s) (NLP):** [More Information Needed]
25
+ - **License:** [More Information Needed]
26
+ - **Finetuned from model [optional]:** [More Information Needed]
27
+
28
+ ### Model Sources [optional]
29
+
30
+ <!-- Provide the basic links for the model. -->
31
+
32
+ - **Repository:** [More Information Needed]
33
+ - **Paper [optional]:** [More Information Needed]
34
+ - **Demo [optional]:** [More Information Needed]
35
+
36
+ ## Uses
37
+
38
+ <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
39
+
40
+ ### Direct Use
41
+
42
+ <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
43
+
44
+ [More Information Needed]
45
+
46
+ ### Downstream Use [optional]
47
+
48
+ <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
49
+
50
+ [More Information Needed]
51
+
52
+ ### Out-of-Scope Use
53
+
54
+ <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
55
+
56
+ [More Information Needed]
57
+
58
+ ## Bias, Risks, and Limitations
59
+
60
+ <!-- This section is meant to convey both technical and sociotechnical limitations. -->
61
+
62
+ [More Information Needed]
63
+
64
+ ### Recommendations
65
+
66
+ <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
67
+
68
+ Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
69
+
70
+ ## How to Get Started with the Model
71
+
72
+ Use the code below to get started with the model.
73
+
74
+ [More Information Needed]
75
+
76
+ ## Training Details
77
+
78
+ ### Training Data
79
+
80
+ <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
81
+
82
+ [More Information Needed]
83
+
84
+ ### Training Procedure
85
+
86
+ <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
87
+
88
+ #### Preprocessing [optional]
89
+
90
+ [More Information Needed]
91
+
92
+
93
+ #### Training Hyperparameters
94
+
95
+ - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
96
+
97
+ #### Speeds, Sizes, Times [optional]
98
+
99
+ <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
100
+
101
+ [More Information Needed]
102
+
103
+ ## Evaluation
104
+
105
+ <!-- This section describes the evaluation protocols and provides the results. -->
106
+
107
+ ### Testing Data, Factors & Metrics
108
+
109
+ #### Testing Data
110
+
111
+ <!-- This should link to a Dataset Card if possible. -->
112
+
113
+ [More Information Needed]
114
+
115
+ #### Factors
116
+
117
+ <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
118
+
119
+ [More Information Needed]
120
+
121
+ #### Metrics
122
+
123
+ <!-- These are the evaluation metrics being used, ideally with a description of why. -->
124
+
125
+ [More Information Needed]
126
+
127
+ ### Results
128
+
129
+ [More Information Needed]
130
+
131
+ #### Summary
132
+
133
+
134
+
135
+ ## Model Examination [optional]
136
+
137
+ <!-- Relevant interpretability work for the model goes here -->
138
+
139
+ [More Information Needed]
140
+
141
+ ## Environmental Impact
142
+
143
+ <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
144
+
145
+ Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
146
+
147
+ - **Hardware Type:** [More Information Needed]
148
+ - **Hours used:** [More Information Needed]
149
+ - **Cloud Provider:** [More Information Needed]
150
+ - **Compute Region:** [More Information Needed]
151
+ - **Carbon Emitted:** [More Information Needed]
152
+
153
+ ## Technical Specifications [optional]
154
+
155
+ ### Model Architecture and Objective
156
+
157
+ [More Information Needed]
158
+
159
+ ### Compute Infrastructure
160
+
161
+ [More Information Needed]
162
+
163
+ #### Hardware
164
+
165
+ [More Information Needed]
166
+
167
+ #### Software
168
+
169
+ [More Information Needed]
170
+
171
+ ## Citation [optional]
172
+
173
+ <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
174
+
175
+ **BibTeX:**
176
+
177
+ [More Information Needed]
178
+
179
+ **APA:**
180
+
181
+ [More Information Needed]
182
+
183
+ ## Glossary [optional]
184
+
185
+ <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
186
+
187
+ [More Information Needed]
188
+
189
+ ## More Information [optional]
190
+
191
+ [More Information Needed]
192
+
193
+ ## Model Card Authors [optional]
194
+
195
+ [More Information Needed]
196
+
197
+ ## Model Card Contact
198
+
199
+ [More Information Needed]
200
+ ### Framework versions
201
+
202
+ - PEFT 0.13.2
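The model card above leaves its quick-start section as a placeholder. As a minimal sketch (not the uploader's documented usage), an adapter repository with this layout can usually be loaded on top of `gpt2-medium` with `transformers` and `peft`; the repo id below is a hypothetical placeholder and should be replaced with this repository's actual Hub path or a local clone.

```python
# Hedged sketch: attach this LoRA adapter to the gpt2-medium base model and generate text.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

adapter_id = "path/or/repo-id/of-this-adapter"  # hypothetical placeholder, not a confirmed repo name

base = AutoModelForCausalLM.from_pretrained("gpt2-medium")
model = PeftModel.from_pretrained(base, adapter_id)  # reads adapter_config.json + adapter_model.safetensors
model.eval()

tokenizer = AutoTokenizer.from_pretrained(adapter_id)  # tokenizer files are included in this upload

inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```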
adapter_config.json ADDED
@@ -0,0 +1,31 @@
+ {
+   "alpha_pattern": {},
+   "auto_mapping": {
+     "base_model_class": "GPT2LMHeadModel",
+     "parent_library": "transformers.models.gpt2.modeling_gpt2"
+   },
+   "base_model_name_or_path": "gpt2-medium",
+   "bias": "none",
+   "fan_in_fan_out": true,
+   "inference_mode": true,
+   "init_lora_weights": true,
+   "layer_replication": null,
+   "layers_pattern": null,
+   "layers_to_transform": null,
+   "loftq_config": {},
+   "lora_alpha": 128,
+   "lora_dropout": 0.0,
+   "megatron_config": null,
+   "megatron_core": "megatron.core",
+   "modules_to_save": null,
+   "peft_type": "LORA",
+   "r": 64,
+   "rank_pattern": {},
+   "revision": null,
+   "target_modules": [
+     "c_attn"
+   ],
+   "task_type": null,
+   "use_dora": false,
+   "use_rslora": false
+ }
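For reference, the adapter configuration above (LoRA rank 64, alpha 128, dropout 0.0, applied to GPT-2's fused `c_attn` projection with `fan_in_fan_out=True`) corresponds roughly to the following `peft` setup. This is a reconstruction from the JSON, not the original training script.

```python
# Approximate reconstruction of the LoRA setup recorded in adapter_config.json (assumed, not the uploader's code).
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2-medium")

lora_config = LoraConfig(
    r=64,                       # "r": 64
    lora_alpha=128,             # "lora_alpha": 128
    lora_dropout=0.0,           # "lora_dropout": 0.0
    target_modules=["c_attn"],  # GPT-2's fused QKV projection
    fan_in_fan_out=True,        # GPT-2 Conv1D layers store weights transposed
    bias="none",
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # reports how few parameters the adapter trains
```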
adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4203cb437ba2cf65b816c0da7440c59049742057c90e0fe86565cb3b31de1c7c
+ size 25172088
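The file above is a Git LFS pointer; the actual ~25 MB adapter weights are fetched when the repository is downloaded (for example via `git lfs pull` or `huggingface_hub`). A minimal sketch for inspecting the downloaded tensors, assuming a local copy of the file:

```python
# Sketch: list the LoRA tensors stored in the downloaded adapter_model.safetensors file.
from safetensors import safe_open

with safe_open("adapter_model.safetensors", framework="pt", device="cpu") as f:
    for name in f.keys():
        print(name, tuple(f.get_tensor(name).shape))
```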
checkpoint-3125/README.md ADDED
@@ -0,0 +1,202 @@
1
+ ---
2
+ base_model: gpt2-medium
3
+ library_name: peft
4
+ ---
5
+
6
+ # Model Card for Model ID
7
+
8
+ <!-- Provide a quick summary of what the model is/does. -->
9
+
10
+
11
+
12
+ ## Model Details
13
+
14
+ ### Model Description
15
+
16
+ <!-- Provide a longer summary of what this model is. -->
17
+
18
+
19
+
20
+ - **Developed by:** [More Information Needed]
21
+ - **Funded by [optional]:** [More Information Needed]
22
+ - **Shared by [optional]:** [More Information Needed]
23
+ - **Model type:** [More Information Needed]
24
+ - **Language(s) (NLP):** [More Information Needed]
25
+ - **License:** [More Information Needed]
26
+ - **Finetuned from model [optional]:** [More Information Needed]
27
+
28
+ ### Model Sources [optional]
29
+
30
+ <!-- Provide the basic links for the model. -->
31
+
32
+ - **Repository:** [More Information Needed]
33
+ - **Paper [optional]:** [More Information Needed]
34
+ - **Demo [optional]:** [More Information Needed]
35
+
36
+ ## Uses
37
+
38
+ <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
39
+
40
+ ### Direct Use
41
+
42
+ <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
43
+
44
+ [More Information Needed]
45
+
46
+ ### Downstream Use [optional]
47
+
48
+ <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
49
+
50
+ [More Information Needed]
51
+
52
+ ### Out-of-Scope Use
53
+
54
+ <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
55
+
56
+ [More Information Needed]
57
+
58
+ ## Bias, Risks, and Limitations
59
+
60
+ <!-- This section is meant to convey both technical and sociotechnical limitations. -->
61
+
62
+ [More Information Needed]
63
+
64
+ ### Recommendations
65
+
66
+ <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
67
+
68
+ Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
69
+
70
+ ## How to Get Started with the Model
71
+
72
+ Use the code below to get started with the model.
73
+
74
+ [More Information Needed]
75
+
76
+ ## Training Details
77
+
78
+ ### Training Data
79
+
80
+ <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
81
+
82
+ [More Information Needed]
83
+
84
+ ### Training Procedure
85
+
86
+ <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
87
+
88
+ #### Preprocessing [optional]
89
+
90
+ [More Information Needed]
91
+
92
+
93
+ #### Training Hyperparameters
94
+
95
+ - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
96
+
97
+ #### Speeds, Sizes, Times [optional]
98
+
99
+ <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
100
+
101
+ [More Information Needed]
102
+
103
+ ## Evaluation
104
+
105
+ <!-- This section describes the evaluation protocols and provides the results. -->
106
+
107
+ ### Testing Data, Factors & Metrics
108
+
109
+ #### Testing Data
110
+
111
+ <!-- This should link to a Dataset Card if possible. -->
112
+
113
+ [More Information Needed]
114
+
115
+ #### Factors
116
+
117
+ <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
118
+
119
+ [More Information Needed]
120
+
121
+ #### Metrics
122
+
123
+ <!-- These are the evaluation metrics being used, ideally with a description of why. -->
124
+
125
+ [More Information Needed]
126
+
127
+ ### Results
128
+
129
+ [More Information Needed]
130
+
131
+ #### Summary
132
+
133
+
134
+
135
+ ## Model Examination [optional]
136
+
137
+ <!-- Relevant interpretability work for the model goes here -->
138
+
139
+ [More Information Needed]
140
+
141
+ ## Environmental Impact
142
+
143
+ <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
144
+
145
+ Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
146
+
147
+ - **Hardware Type:** [More Information Needed]
148
+ - **Hours used:** [More Information Needed]
149
+ - **Cloud Provider:** [More Information Needed]
150
+ - **Compute Region:** [More Information Needed]
151
+ - **Carbon Emitted:** [More Information Needed]
152
+
153
+ ## Technical Specifications [optional]
154
+
155
+ ### Model Architecture and Objective
156
+
157
+ [More Information Needed]
158
+
159
+ ### Compute Infrastructure
160
+
161
+ [More Information Needed]
162
+
163
+ #### Hardware
164
+
165
+ [More Information Needed]
166
+
167
+ #### Software
168
+
169
+ [More Information Needed]
170
+
171
+ ## Citation [optional]
172
+
173
+ <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
174
+
175
+ **BibTeX:**
176
+
177
+ [More Information Needed]
178
+
179
+ **APA:**
180
+
181
+ [More Information Needed]
182
+
183
+ ## Glossary [optional]
184
+
185
+ <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
186
+
187
+ [More Information Needed]
188
+
189
+ ## More Information [optional]
190
+
191
+ [More Information Needed]
192
+
193
+ ## Model Card Authors [optional]
194
+
195
+ [More Information Needed]
196
+
197
+ ## Model Card Contact
198
+
199
+ [More Information Needed]
200
+ ### Framework versions
201
+
202
+ - PEFT 0.13.2
checkpoint-3125/adapter_config.json ADDED
@@ -0,0 +1,31 @@
1
+ {
2
+ "alpha_pattern": {},
3
+ "auto_mapping": {
4
+ "base_model_class": "GPT2LMHeadModel",
5
+ "parent_library": "transformers.models.gpt2.modeling_gpt2"
6
+ },
7
+ "base_model_name_or_path": "gpt2-medium",
8
+ "bias": "none",
9
+ "fan_in_fan_out": true,
10
+ "inference_mode": true,
11
+ "init_lora_weights": true,
12
+ "layer_replication": null,
13
+ "layers_pattern": null,
14
+ "layers_to_transform": null,
15
+ "loftq_config": {},
16
+ "lora_alpha": 128,
17
+ "lora_dropout": 0.0,
18
+ "megatron_config": null,
19
+ "megatron_core": "megatron.core",
20
+ "modules_to_save": null,
21
+ "peft_type": "LORA",
22
+ "r": 64,
23
+ "rank_pattern": {},
24
+ "revision": null,
25
+ "target_modules": [
26
+ "c_attn"
27
+ ],
28
+ "task_type": null,
29
+ "use_dora": false,
30
+ "use_rslora": false
31
+ }
checkpoint-3125/adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:d4a164f714ae15594234d9c035160a7988916f46999d8d42f982c5ad4826dda9
3
+ size 25172088
checkpoint-3125/merges.txt ADDED
The diff for this file is too large to render. See raw diff
checkpoint-3125/optimizer.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:824b32a29a8c318cbce757f2fc69edb44d20b9c11d3941f0fc866fb72ab0bc8b
3
+ size 50372538
checkpoint-3125/rng_state.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:234b996b070281d39ab4e6086f2c1029d569ac8ba3735d66b9eefa71b682392d
3
+ size 14244
checkpoint-3125/scheduler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:392855cc9cbe029377262097ef598767921e2a3bc6937822c989a7603ee182c3
3
+ size 1064
checkpoint-3125/special_tokens_map.json ADDED
@@ -0,0 +1,24 @@
1
+ {
2
+ "bos_token": {
3
+ "content": "<|endoftext|>",
4
+ "lstrip": false,
5
+ "normalized": true,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "eos_token": {
10
+ "content": "<|endoftext|>",
11
+ "lstrip": false,
12
+ "normalized": true,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "pad_token": "<|endoftext|>",
17
+ "unk_token": {
18
+ "content": "<|endoftext|>",
19
+ "lstrip": false,
20
+ "normalized": true,
21
+ "rstrip": false,
22
+ "single_word": false
23
+ }
24
+ }
checkpoint-3125/tokenizer_config.json ADDED
@@ -0,0 +1,22 @@
1
+ {
2
+ "add_bos_token": false,
3
+ "add_prefix_space": false,
4
+ "added_tokens_decoder": {
5
+ "50256": {
6
+ "content": "<|endoftext|>",
7
+ "lstrip": false,
8
+ "normalized": true,
9
+ "rstrip": false,
10
+ "single_word": false,
11
+ "special": true
12
+ }
13
+ },
14
+ "bos_token": "<|endoftext|>",
15
+ "clean_up_tokenization_spaces": false,
16
+ "eos_token": "<|endoftext|>",
17
+ "errors": "replace",
18
+ "model_max_length": 1024,
19
+ "pad_token": "<|endoftext|>",
20
+ "tokenizer_class": "GPT2Tokenizer",
21
+ "unk_token": "<|endoftext|>"
22
+ }
checkpoint-3125/trainer_state.json ADDED
The diff for this file is too large to render. See raw diff
checkpoint-3125/training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:cfade90fc53f4fc9f6f077f18cfaab66068bae3b4a1c448a94e0ef7e54a4479f
3
+ size 5304
checkpoint-3125/vocab.json ADDED
The diff for this file is too large to render. See raw diff
checkpoint-6250/README.md ADDED
@@ -0,0 +1,202 @@
1
+ ---
2
+ base_model: gpt2-medium
3
+ library_name: peft
4
+ ---
5
+
6
+ # Model Card for Model ID
7
+
8
+ <!-- Provide a quick summary of what the model is/does. -->
9
+
10
+
11
+
12
+ ## Model Details
13
+
14
+ ### Model Description
15
+
16
+ <!-- Provide a longer summary of what this model is. -->
17
+
18
+
19
+
20
+ - **Developed by:** [More Information Needed]
21
+ - **Funded by [optional]:** [More Information Needed]
22
+ - **Shared by [optional]:** [More Information Needed]
23
+ - **Model type:** [More Information Needed]
24
+ - **Language(s) (NLP):** [More Information Needed]
25
+ - **License:** [More Information Needed]
26
+ - **Finetuned from model [optional]:** [More Information Needed]
27
+
28
+ ### Model Sources [optional]
29
+
30
+ <!-- Provide the basic links for the model. -->
31
+
32
+ - **Repository:** [More Information Needed]
33
+ - **Paper [optional]:** [More Information Needed]
34
+ - **Demo [optional]:** [More Information Needed]
35
+
36
+ ## Uses
37
+
38
+ <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
39
+
40
+ ### Direct Use
41
+
42
+ <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
43
+
44
+ [More Information Needed]
45
+
46
+ ### Downstream Use [optional]
47
+
48
+ <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
49
+
50
+ [More Information Needed]
51
+
52
+ ### Out-of-Scope Use
53
+
54
+ <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
55
+
56
+ [More Information Needed]
57
+
58
+ ## Bias, Risks, and Limitations
59
+
60
+ <!-- This section is meant to convey both technical and sociotechnical limitations. -->
61
+
62
+ [More Information Needed]
63
+
64
+ ### Recommendations
65
+
66
+ <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
67
+
68
+ Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
69
+
70
+ ## How to Get Started with the Model
71
+
72
+ Use the code below to get started with the model.
73
+
74
+ [More Information Needed]
75
+
76
+ ## Training Details
77
+
78
+ ### Training Data
79
+
80
+ <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
81
+
82
+ [More Information Needed]
83
+
84
+ ### Training Procedure
85
+
86
+ <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
87
+
88
+ #### Preprocessing [optional]
89
+
90
+ [More Information Needed]
91
+
92
+
93
+ #### Training Hyperparameters
94
+
95
+ - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
96
+
97
+ #### Speeds, Sizes, Times [optional]
98
+
99
+ <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
100
+
101
+ [More Information Needed]
102
+
103
+ ## Evaluation
104
+
105
+ <!-- This section describes the evaluation protocols and provides the results. -->
106
+
107
+ ### Testing Data, Factors & Metrics
108
+
109
+ #### Testing Data
110
+
111
+ <!-- This should link to a Dataset Card if possible. -->
112
+
113
+ [More Information Needed]
114
+
115
+ #### Factors
116
+
117
+ <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
118
+
119
+ [More Information Needed]
120
+
121
+ #### Metrics
122
+
123
+ <!-- These are the evaluation metrics being used, ideally with a description of why. -->
124
+
125
+ [More Information Needed]
126
+
127
+ ### Results
128
+
129
+ [More Information Needed]
130
+
131
+ #### Summary
132
+
133
+
134
+
135
+ ## Model Examination [optional]
136
+
137
+ <!-- Relevant interpretability work for the model goes here -->
138
+
139
+ [More Information Needed]
140
+
141
+ ## Environmental Impact
142
+
143
+ <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
144
+
145
+ Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
146
+
147
+ - **Hardware Type:** [More Information Needed]
148
+ - **Hours used:** [More Information Needed]
149
+ - **Cloud Provider:** [More Information Needed]
150
+ - **Compute Region:** [More Information Needed]
151
+ - **Carbon Emitted:** [More Information Needed]
152
+
153
+ ## Technical Specifications [optional]
154
+
155
+ ### Model Architecture and Objective
156
+
157
+ [More Information Needed]
158
+
159
+ ### Compute Infrastructure
160
+
161
+ [More Information Needed]
162
+
163
+ #### Hardware
164
+
165
+ [More Information Needed]
166
+
167
+ #### Software
168
+
169
+ [More Information Needed]
170
+
171
+ ## Citation [optional]
172
+
173
+ <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
174
+
175
+ **BibTeX:**
176
+
177
+ [More Information Needed]
178
+
179
+ **APA:**
180
+
181
+ [More Information Needed]
182
+
183
+ ## Glossary [optional]
184
+
185
+ <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
186
+
187
+ [More Information Needed]
188
+
189
+ ## More Information [optional]
190
+
191
+ [More Information Needed]
192
+
193
+ ## Model Card Authors [optional]
194
+
195
+ [More Information Needed]
196
+
197
+ ## Model Card Contact
198
+
199
+ [More Information Needed]
200
+ ### Framework versions
201
+
202
+ - PEFT 0.13.2
checkpoint-6250/adapter_config.json ADDED
@@ -0,0 +1,31 @@
1
+ {
2
+ "alpha_pattern": {},
3
+ "auto_mapping": {
4
+ "base_model_class": "GPT2LMHeadModel",
5
+ "parent_library": "transformers.models.gpt2.modeling_gpt2"
6
+ },
7
+ "base_model_name_or_path": "gpt2-medium",
8
+ "bias": "none",
9
+ "fan_in_fan_out": true,
10
+ "inference_mode": true,
11
+ "init_lora_weights": true,
12
+ "layer_replication": null,
13
+ "layers_pattern": null,
14
+ "layers_to_transform": null,
15
+ "loftq_config": {},
16
+ "lora_alpha": 128,
17
+ "lora_dropout": 0.0,
18
+ "megatron_config": null,
19
+ "megatron_core": "megatron.core",
20
+ "modules_to_save": null,
21
+ "peft_type": "LORA",
22
+ "r": 64,
23
+ "rank_pattern": {},
24
+ "revision": null,
25
+ "target_modules": [
26
+ "c_attn"
27
+ ],
28
+ "task_type": null,
29
+ "use_dora": false,
30
+ "use_rslora": false
31
+ }
checkpoint-6250/adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:7d6f323520015cd24beeccd43da58556f9d82fdbca183905225e94e4c27c49e2
3
+ size 25172088
checkpoint-6250/merges.txt ADDED
The diff for this file is too large to render. See raw diff
checkpoint-6250/optimizer.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:706c40d71ee81f476ca836c4f7427e9536b94b33edcc47c067353af2f308f586
3
+ size 50372538
checkpoint-6250/rng_state.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:294700e5426ebab697d259a3cca3f780e2794ad9d399ce853512a6eff50a048e
3
+ size 14244
checkpoint-6250/scheduler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:54bf5bc35983da2516a319daa8db116d3e18d36fcecad8a541a6fed01b6c2f20
3
+ size 1064
checkpoint-6250/special_tokens_map.json ADDED
@@ -0,0 +1,24 @@
1
+ {
2
+ "bos_token": {
3
+ "content": "<|endoftext|>",
4
+ "lstrip": false,
5
+ "normalized": true,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "eos_token": {
10
+ "content": "<|endoftext|>",
11
+ "lstrip": false,
12
+ "normalized": true,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "pad_token": "<|endoftext|>",
17
+ "unk_token": {
18
+ "content": "<|endoftext|>",
19
+ "lstrip": false,
20
+ "normalized": true,
21
+ "rstrip": false,
22
+ "single_word": false
23
+ }
24
+ }
checkpoint-6250/tokenizer_config.json ADDED
@@ -0,0 +1,22 @@
1
+ {
2
+ "add_bos_token": false,
3
+ "add_prefix_space": false,
4
+ "added_tokens_decoder": {
5
+ "50256": {
6
+ "content": "<|endoftext|>",
7
+ "lstrip": false,
8
+ "normalized": true,
9
+ "rstrip": false,
10
+ "single_word": false,
11
+ "special": true
12
+ }
13
+ },
14
+ "bos_token": "<|endoftext|>",
15
+ "clean_up_tokenization_spaces": false,
16
+ "eos_token": "<|endoftext|>",
17
+ "errors": "replace",
18
+ "model_max_length": 1024,
19
+ "pad_token": "<|endoftext|>",
20
+ "tokenizer_class": "GPT2Tokenizer",
21
+ "unk_token": "<|endoftext|>"
22
+ }
checkpoint-6250/trainer_state.json ADDED
The diff for this file is too large to render. See raw diff
checkpoint-6250/training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:cfade90fc53f4fc9f6f077f18cfaab66068bae3b4a1c448a94e0ef7e54a4479f
3
+ size 5304
checkpoint-6250/vocab.json ADDED
The diff for this file is too large to render. See raw diff
checkpoint-9375/README.md ADDED
@@ -0,0 +1,202 @@
1
+ ---
2
+ base_model: gpt2-medium
3
+ library_name: peft
4
+ ---
5
+
6
+ # Model Card for Model ID
7
+
8
+ <!-- Provide a quick summary of what the model is/does. -->
9
+
10
+
11
+
12
+ ## Model Details
13
+
14
+ ### Model Description
15
+
16
+ <!-- Provide a longer summary of what this model is. -->
17
+
18
+
19
+
20
+ - **Developed by:** [More Information Needed]
21
+ - **Funded by [optional]:** [More Information Needed]
22
+ - **Shared by [optional]:** [More Information Needed]
23
+ - **Model type:** [More Information Needed]
24
+ - **Language(s) (NLP):** [More Information Needed]
25
+ - **License:** [More Information Needed]
26
+ - **Finetuned from model [optional]:** [More Information Needed]
27
+
28
+ ### Model Sources [optional]
29
+
30
+ <!-- Provide the basic links for the model. -->
31
+
32
+ - **Repository:** [More Information Needed]
33
+ - **Paper [optional]:** [More Information Needed]
34
+ - **Demo [optional]:** [More Information Needed]
35
+
36
+ ## Uses
37
+
38
+ <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
39
+
40
+ ### Direct Use
41
+
42
+ <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
43
+
44
+ [More Information Needed]
45
+
46
+ ### Downstream Use [optional]
47
+
48
+ <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
49
+
50
+ [More Information Needed]
51
+
52
+ ### Out-of-Scope Use
53
+
54
+ <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
55
+
56
+ [More Information Needed]
57
+
58
+ ## Bias, Risks, and Limitations
59
+
60
+ <!-- This section is meant to convey both technical and sociotechnical limitations. -->
61
+
62
+ [More Information Needed]
63
+
64
+ ### Recommendations
65
+
66
+ <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
67
+
68
+ Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
69
+
70
+ ## How to Get Started with the Model
71
+
72
+ Use the code below to get started with the model.
73
+
74
+ [More Information Needed]
75
+
76
+ ## Training Details
77
+
78
+ ### Training Data
79
+
80
+ <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
81
+
82
+ [More Information Needed]
83
+
84
+ ### Training Procedure
85
+
86
+ <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
87
+
88
+ #### Preprocessing [optional]
89
+
90
+ [More Information Needed]
91
+
92
+
93
+ #### Training Hyperparameters
94
+
95
+ - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
96
+
97
+ #### Speeds, Sizes, Times [optional]
98
+
99
+ <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
100
+
101
+ [More Information Needed]
102
+
103
+ ## Evaluation
104
+
105
+ <!-- This section describes the evaluation protocols and provides the results. -->
106
+
107
+ ### Testing Data, Factors & Metrics
108
+
109
+ #### Testing Data
110
+
111
+ <!-- This should link to a Dataset Card if possible. -->
112
+
113
+ [More Information Needed]
114
+
115
+ #### Factors
116
+
117
+ <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
118
+
119
+ [More Information Needed]
120
+
121
+ #### Metrics
122
+
123
+ <!-- These are the evaluation metrics being used, ideally with a description of why. -->
124
+
125
+ [More Information Needed]
126
+
127
+ ### Results
128
+
129
+ [More Information Needed]
130
+
131
+ #### Summary
132
+
133
+
134
+
135
+ ## Model Examination [optional]
136
+
137
+ <!-- Relevant interpretability work for the model goes here -->
138
+
139
+ [More Information Needed]
140
+
141
+ ## Environmental Impact
142
+
143
+ <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
144
+
145
+ Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
146
+
147
+ - **Hardware Type:** [More Information Needed]
148
+ - **Hours used:** [More Information Needed]
149
+ - **Cloud Provider:** [More Information Needed]
150
+ - **Compute Region:** [More Information Needed]
151
+ - **Carbon Emitted:** [More Information Needed]
152
+
153
+ ## Technical Specifications [optional]
154
+
155
+ ### Model Architecture and Objective
156
+
157
+ [More Information Needed]
158
+
159
+ ### Compute Infrastructure
160
+
161
+ [More Information Needed]
162
+
163
+ #### Hardware
164
+
165
+ [More Information Needed]
166
+
167
+ #### Software
168
+
169
+ [More Information Needed]
170
+
171
+ ## Citation [optional]
172
+
173
+ <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
174
+
175
+ **BibTeX:**
176
+
177
+ [More Information Needed]
178
+
179
+ **APA:**
180
+
181
+ [More Information Needed]
182
+
183
+ ## Glossary [optional]
184
+
185
+ <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
186
+
187
+ [More Information Needed]
188
+
189
+ ## More Information [optional]
190
+
191
+ [More Information Needed]
192
+
193
+ ## Model Card Authors [optional]
194
+
195
+ [More Information Needed]
196
+
197
+ ## Model Card Contact
198
+
199
+ [More Information Needed]
200
+ ### Framework versions
201
+
202
+ - PEFT 0.13.2
checkpoint-9375/adapter_config.json ADDED
@@ -0,0 +1,31 @@
1
+ {
2
+ "alpha_pattern": {},
3
+ "auto_mapping": {
4
+ "base_model_class": "GPT2LMHeadModel",
5
+ "parent_library": "transformers.models.gpt2.modeling_gpt2"
6
+ },
7
+ "base_model_name_or_path": "gpt2-medium",
8
+ "bias": "none",
9
+ "fan_in_fan_out": true,
10
+ "inference_mode": true,
11
+ "init_lora_weights": true,
12
+ "layer_replication": null,
13
+ "layers_pattern": null,
14
+ "layers_to_transform": null,
15
+ "loftq_config": {},
16
+ "lora_alpha": 128,
17
+ "lora_dropout": 0.0,
18
+ "megatron_config": null,
19
+ "megatron_core": "megatron.core",
20
+ "modules_to_save": null,
21
+ "peft_type": "LORA",
22
+ "r": 64,
23
+ "rank_pattern": {},
24
+ "revision": null,
25
+ "target_modules": [
26
+ "c_attn"
27
+ ],
28
+ "task_type": null,
29
+ "use_dora": false,
30
+ "use_rslora": false
31
+ }
checkpoint-9375/adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:4203cb437ba2cf65b816c0da7440c59049742057c90e0fe86565cb3b31de1c7c
3
+ size 25172088
checkpoint-9375/merges.txt ADDED
The diff for this file is too large to render. See raw diff
checkpoint-9375/optimizer.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:cfe40db245c3c1bc48e15349c9338f4e3d20aaa7274743dee35426a5481dd19b
3
+ size 50372538
checkpoint-9375/rng_state.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:246b9f0b5d50be9c9cd45261aff22fd5367b0e790cd82bdc7d7b9c2b771c72a6
3
+ size 14244
checkpoint-9375/scheduler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:a17c6f8f90b7492d85e3ccb69e5e19d7c97f2aa89c2af5ae7acd1722212a18fd
3
+ size 1064
checkpoint-9375/special_tokens_map.json ADDED
@@ -0,0 +1,24 @@
1
+ {
2
+ "bos_token": {
3
+ "content": "<|endoftext|>",
4
+ "lstrip": false,
5
+ "normalized": true,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "eos_token": {
10
+ "content": "<|endoftext|>",
11
+ "lstrip": false,
12
+ "normalized": true,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "pad_token": "<|endoftext|>",
17
+ "unk_token": {
18
+ "content": "<|endoftext|>",
19
+ "lstrip": false,
20
+ "normalized": true,
21
+ "rstrip": false,
22
+ "single_word": false
23
+ }
24
+ }
checkpoint-9375/tokenizer_config.json ADDED
@@ -0,0 +1,22 @@
1
+ {
2
+ "add_bos_token": false,
3
+ "add_prefix_space": false,
4
+ "added_tokens_decoder": {
5
+ "50256": {
6
+ "content": "<|endoftext|>",
7
+ "lstrip": false,
8
+ "normalized": true,
9
+ "rstrip": false,
10
+ "single_word": false,
11
+ "special": true
12
+ }
13
+ },
14
+ "bos_token": "<|endoftext|>",
15
+ "clean_up_tokenization_spaces": false,
16
+ "eos_token": "<|endoftext|>",
17
+ "errors": "replace",
18
+ "model_max_length": 1024,
19
+ "pad_token": "<|endoftext|>",
20
+ "tokenizer_class": "GPT2Tokenizer",
21
+ "unk_token": "<|endoftext|>"
22
+ }
checkpoint-9375/trainer_state.json ADDED
The diff for this file is too large to render. See raw diff
checkpoint-9375/training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:cfade90fc53f4fc9f6f077f18cfaab66068bae3b4a1c448a94e0ef7e54a4479f
3
+ size 5304
checkpoint-9375/vocab.json ADDED
The diff for this file is too large to render. See raw diff
config.json ADDED
@@ -0,0 +1,40 @@
+ {
+   "_name_or_path": "gpt2-medium",
+   "activation_function": "gelu_new",
+   "architectures": [
+     "GPT2LMHeadModel"
+   ],
+   "attn_pdrop": 0.1,
+   "bos_token_id": 50256,
+   "embd_pdrop": 0.1,
+   "eos_token_id": 50256,
+   "initializer_range": 0.02,
+   "layer_norm_epsilon": 1e-05,
+   "model_type": "gpt2",
+   "n_ctx": 1024,
+   "n_embd": 1024,
+   "n_head": 16,
+   "n_inner": null,
+   "n_layer": 24,
+   "n_positions": 1024,
+   "n_special": 0,
+   "predict_special_tokens": true,
+   "reorder_and_upcast_attn": false,
+   "resid_pdrop": 0.1,
+   "scale_attn_by_inverse_layer_idx": false,
+   "scale_attn_weights": true,
+   "summary_activation": null,
+   "summary_first_dropout": 0.1,
+   "summary_proj_to_labels": true,
+   "summary_type": "cls_index",
+   "summary_use_proj": true,
+   "task_specific_params": {
+     "text-generation": {
+       "do_sample": true,
+       "max_length": 50
+     }
+   },
+   "transformers_version": "4.45.2",
+   "use_cache": true,
+   "vocab_size": 50257
+ }
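The config above is the stock `gpt2-medium` architecture (24 layers, 1024-dim embeddings, 16 heads, 1024-token context, 50257-token vocabulary); it describes the frozen base model rather than the adapter. A quick sanity check, as a sketch:

```python
# Sketch: confirm the base-model hyperparameters recorded in config.json.
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("gpt2-medium")
assert cfg.n_layer == 24 and cfg.n_embd == 1024 and cfg.n_head == 16
assert cfg.n_positions == 1024 and cfg.vocab_size == 50257
print(cfg.model_type)
```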
merges.txt ADDED
The diff for this file is too large to render. See raw diff
special_tokens_map.json ADDED
@@ -0,0 +1,24 @@
+ {
+   "bos_token": {
+     "content": "<|endoftext|>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "<|endoftext|>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": "<|endoftext|>",
+   "unk_token": {
+     "content": "<|endoftext|>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer_config.json ADDED
@@ -0,0 +1,22 @@
+ {
+   "add_bos_token": false,
+   "add_prefix_space": false,
+   "added_tokens_decoder": {
+     "50256": {
+       "content": "<|endoftext|>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "bos_token": "<|endoftext|>",
+   "clean_up_tokenization_spaces": false,
+   "eos_token": "<|endoftext|>",
+   "errors": "replace",
+   "model_max_length": 1024,
+   "pad_token": "<|endoftext|>",
+   "tokenizer_class": "GPT2Tokenizer",
+   "unk_token": "<|endoftext|>"
+ }
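As the tokenizer files show, this repository ships the standard GPT-2 BPE tokenizer with `<|endoftext|>` serving as BOS, EOS, UNK, and padding token, and a 1024-token maximum length. A minimal usage sketch, assuming the stock `gpt2-medium` tokenizer as the source:

```python
# Sketch: load the GPT-2 tokenizer and pad with <|endoftext|>, mirroring the config above.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2-medium")
tokenizer.pad_token = tokenizer.eos_token  # mirrors "pad_token": "<|endoftext|>"

batch = tokenizer(
    ["a short example", "a slightly longer example sentence"],
    padding=True,
    truncation=True,
    max_length=1024,   # matches "model_max_length": 1024
    return_tensors="pt",
)
print(batch["input_ids"].shape)
```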
training_logs.csv ADDED
@@ -0,0 +1,939 @@
1
+ step,training_loss,grad_norm
2
+ 10,0.3558,0.32071906328201294
3
+ 20,0.3169,0.2794065475463867
4
+ 30,0.3206,0.25563815236091614
5
+ 40,0.3241,0.4417332410812378
6
+ 50,0.319,0.37645748257637024
7
+ 60,0.3153,0.3131222128868103
8
+ 70,0.3355,0.2674979567527771
9
+ 80,0.3323,0.3743484616279602
10
+ 90,0.2813,0.34157514572143555
11
+ 100,0.2786,0.5245288610458374
12
+ 110,0.2544,0.181500643491745
13
+ 120,0.2456,0.22667795419692993
14
+ 130,0.2398,0.244036465883255
15
+ 140,0.2506,0.12225145101547241
16
+ 150,0.2614,0.4552425742149353
17
+ 160,0.2477,0.18883612751960754
18
+ 170,0.2539,0.2779008746147156
19
+ 180,0.2483,0.138456329703331
20
+ 190,0.2315,0.12943269312381744
21
+ 200,0.2284,0.16588015854358673
22
+ 210,0.2268,0.2964286804199219
23
+ 220,0.2358,0.15502123534679413
24
+ 230,0.2267,0.22678083181381226
25
+ 240,0.2438,0.2653590440750122
26
+ 250,0.2205,0.15436983108520508
27
+ 260,0.2401,0.36812978982925415
28
+ 270,0.2275,0.23274561762809753
29
+ 280,0.2544,0.3396410644054413
30
+ 290,0.2081,0.11065305769443512
31
+ 300,0.2309,0.20382928848266602
32
+ 310,0.2236,0.472240686416626
33
+ 320,0.2043,0.1339893788099289
34
+ 330,0.2441,0.23823536932468414
35
+ 340,0.2392,0.12380514293909073
36
+ 350,0.2153,0.3463793694972992
37
+ 360,0.2013,0.1988895982503891
38
+ 370,0.2112,0.3201592266559601
39
+ 380,0.2098,0.1138683408498764
40
+ 390,0.2197,0.33196476101875305
41
+ 400,0.2155,0.2857055068016052
42
+ 410,0.2351,0.7026212215423584
43
+ 420,0.189,0.1584784835577011
44
+ 430,0.2174,0.3569210469722748
45
+ 440,0.2291,0.12185634672641754
46
+ 450,0.2248,0.20530372858047485
47
+ 460,0.2219,0.3720776438713074
48
+ 470,0.2138,0.2163982093334198
49
+ 480,0.2225,0.1827816218137741
50
+ 490,0.2148,0.3397406339645386
51
+ 500,0.2092,0.2826184034347534
52
+ 510,0.212,0.17477303743362427
53
+ 520,0.2027,0.14256270229816437
54
+ 530,0.2157,0.20557594299316406
55
+ 540,0.2142,0.19155050814151764
56
+ 550,0.2027,0.1431547850370407
57
+ 560,0.1996,0.17494995892047882
58
+ 570,0.2422,0.21769669651985168
59
+ 580,0.2158,0.5145639181137085
60
+ 590,0.2086,0.19497331976890564
61
+ 600,0.2093,0.211255744099617
62
+ 610,0.2148,0.4832595884799957
63
+ 620,0.2242,0.24792829155921936
64
+ 630,0.2035,0.48613137006759644
65
+ 640,0.1963,0.20214977860450745
66
+ 650,0.216,0.1867765188217163
67
+ 660,0.1842,0.27869710326194763
68
+ 670,0.2015,0.21029245853424072
69
+ 680,0.2071,0.18919135630130768
70
+ 690,0.1959,0.14328770339488983
71
+ 700,0.2015,0.23997552692890167
72
+ 710,0.2155,0.22771386802196503
73
+ 720,0.1917,0.15411153435707092
74
+ 730,0.2056,0.2147682160139084
75
+ 740,0.2049,0.22184568643569946
76
+ 750,0.2121,0.30972549319267273
77
+ 760,0.218,0.23781967163085938
78
+ 770,0.2019,0.16071753203868866
79
+ 780,0.2186,0.2155902236700058
80
+ 790,0.2066,0.5538204312324524
81
+ 800,0.2039,0.2163044959306717
82
+ 810,0.1986,0.16447103023529053
83
+ 820,0.1937,0.1273125559091568
84
+ 830,0.2191,0.12841354310512543
85
+ 840,0.1788,0.20279747247695923
86
+ 850,0.2197,0.26678866147994995
87
+ 860,0.2142,0.4068812429904938
88
+ 870,0.2058,0.25548458099365234
89
+ 880,0.2008,0.2944236099720001
90
+ 890,0.2014,0.2201596051454544
91
+ 900,0.1993,0.1315656304359436
92
+ 910,0.2134,0.5913531184196472
93
+ 920,0.217,0.22978711128234863
94
+ 930,0.2011,0.3520239293575287
95
+ 940,0.2007,0.15617959201335907
96
+ 950,0.1968,0.258504182100296
97
+ 960,0.2023,0.13370569050312042
98
+ 970,0.1989,0.14209233224391937
99
+ 980,0.1948,0.2141662985086441
100
+ 990,0.1897,0.26815688610076904
101
+ 1000,0.2066,0.6075823903083801
102
+ 1010,0.2087,0.1918363869190216
103
+ 1020,0.2011,0.371722936630249
104
+ 1030,0.1861,0.1037965714931488
105
+ 1040,0.1966,0.1006101593375206
106
+ 1050,0.2032,0.31892770528793335
107
+ 1060,0.2007,0.12998707592487335
108
+ 1070,0.2204,0.34640854597091675
109
+ 1080,0.1995,0.3480440080165863
110
+ 1090,0.2173,0.16862376034259796
111
+ 1100,0.1936,0.3368370831012726
112
+ 1110,0.1917,0.41733112931251526
113
+ 1120,0.2071,0.16220298409461975
114
+ 1130,0.2099,0.2908920645713806
115
+ 1140,0.2112,0.1940367966890335
116
+ 1150,0.2096,0.5486602783203125
117
+ 1160,0.211,0.36824262142181396
118
+ 1170,0.2067,0.1570533812046051
119
+ 1180,0.2051,0.17821206152439117
120
+ 1190,0.2185,0.23451495170593262
121
+ 1200,0.1854,0.13405710458755493
122
+ 1210,0.2037,0.25023525953292847
123
+ 1220,0.2015,0.08981674164533615
124
+ 1230,0.1838,0.1914534717798233
125
+ 1240,0.2195,0.15704376995563507
126
+ 1250,0.1919,0.39570775628089905
127
+ 1260,0.2101,0.17061889171600342
128
+ 1270,0.1927,0.1270769089460373
129
+ 1280,0.2023,0.15512141585350037
130
+ 1290,0.2035,0.266217976808548
131
+ 1300,0.2303,0.5519210696220398
132
+ 1310,0.1986,0.23824504017829895
133
+ 1320,0.2125,0.2041919231414795
134
+ 1330,0.209,0.4208974540233612
135
+ 1340,0.1876,0.14202351868152618
136
+ 1350,0.1967,0.3176891505718231
137
+ 1360,0.2025,0.16158701479434967
138
+ 1370,0.1883,0.12525491416454315
139
+ 1380,0.2193,0.2564144730567932
140
+ 1390,0.2042,0.19788813591003418
141
+ 1400,0.1954,0.18375135958194733
142
+ 1410,0.1966,0.3567545711994171
143
+ 1420,0.1921,0.3742877244949341
144
+ 1430,0.2086,0.4864203929901123
145
+ 1440,0.2219,0.5981997847557068
146
+ 1450,0.1856,0.15906871855258942
147
+ 1460,0.2173,0.41816794872283936
148
+ 1470,0.2009,0.18362511694431305
149
+ 1480,0.1921,0.15626972913742065
150
+ 1490,0.204,0.23124927282333374
151
+ 1500,0.191,0.11154084652662277
152
+ 1510,0.1899,0.3043041527271271
153
+ 1520,0.2047,0.32614457607269287
154
+ 1530,0.1918,0.2672986686229706
155
+ 1540,0.2092,0.17536531388759613
156
+ 1550,0.2188,0.2069392204284668
157
+ 1560,0.1877,0.12836001813411713
158
+ 1570,0.185,0.16351206600666046
159
+ 1580,0.1897,0.1282166838645935
160
+ 1590,0.1955,0.14068758487701416
161
+ 1600,0.2187,0.3121204674243927
162
+ 1610,0.1891,0.1809954047203064
163
+ 1620,0.2064,0.15994557738304138
164
+ 1630,0.2156,0.19730529189109802
165
+ 1640,0.2196,0.3479261100292206
166
+ 1650,0.1945,0.14109618961811066
167
+ 1660,0.1865,0.24992233514785767
168
+ 1670,0.1975,0.1633276343345642
169
+ 1680,0.1947,0.12861216068267822
170
+ 1690,0.1835,0.22408680617809296
171
+ 1700,0.1976,0.2990849018096924
172
+ 1710,0.1883,0.2218644618988037
173
+ 1720,0.1949,0.26112663745880127
174
+ 1730,0.1854,0.33678847551345825
175
+ 1740,0.1985,0.1794002503156662
176
+ 1750,0.2192,0.49004507064819336
177
+ 1760,0.211,0.1695421189069748
178
+ 1770,0.1806,0.134551540017128
179
+ 1780,0.2082,0.14868737757205963
180
+ 1790,0.1986,0.33741533756256104
181
+ 1800,0.2016,0.18197348713874817
182
+ 1810,0.1937,0.2026369273662567
183
+ 1820,0.1972,0.17229503393173218
184
+ 1830,0.2166,0.5893936157226562
185
+ 1840,0.1995,0.18323490023612976
186
+ 1850,0.1864,0.1333731710910797
187
+ 1860,0.1909,0.141704261302948
188
+ 1870,0.1977,0.3868229389190674
189
+ 1880,0.1979,0.2665703594684601
190
+ 1890,0.1976,0.23620040714740753
191
+ 1900,0.1943,0.23175929486751556
192
+ 1910,0.1952,0.17958831787109375
193
+ 1920,0.2042,0.3064885139465332
194
+ 1930,0.2159,0.14014050364494324
195
+ 1940,0.2077,0.11555980890989304
196
+ 1950,0.1943,0.24481888115406036
197
+ 1960,0.195,0.26040998101234436
198
+ 1970,0.2026,0.3638281524181366
199
+ 1980,0.1794,0.14876505732536316
200
+ 1990,0.1859,0.26840996742248535
201
+ 2000,0.1998,0.23965291678905487
202
+ 2010,0.1928,0.14806678891181946
203
+ 2020,0.2104,0.27742016315460205
204
+ 2030,0.1774,0.18194366991519928
205
+ 2040,0.1916,0.12873081862926483
206
+ 2050,0.1969,0.46400418877601624
207
+ 2060,0.1977,0.29666176438331604
208
+ 2070,0.1901,0.1975684016942978
209
+ 2080,0.1839,0.12535278499126434
210
+ 2090,0.1889,0.1096622422337532
211
+ 2100,0.1912,0.19239233434200287
212
+ 2110,0.2143,0.10907316207885742
213
+ 2120,0.1993,0.1855459362268448
214
+ 2130,0.1934,0.15058843791484833
215
+ 2140,0.1791,0.1441117525100708
216
+ 2150,0.1947,0.11171973496675491
217
+ 2160,0.1933,0.14469866454601288
218
+ 2170,0.2048,0.26807868480682373
219
+ 2180,0.1722,0.13813795149326324
220
+ 2190,0.1912,0.2016972154378891
221
+ 2200,0.1961,0.1609937995672226
222
+ 2210,0.1811,0.14046931266784668
223
+ 2220,0.2072,0.33526933193206787
224
+ 2230,0.1786,0.1615295708179474
225
+ 2240,0.1922,0.34492355585098267
226
+ 2250,0.211,0.163607656955719
227
+ 2260,0.2107,0.2812877297401428
228
+ 2270,0.1917,0.42100223898887634
229
+ 2280,0.1918,0.27094176411628723
230
+ 2290,0.1854,0.37242257595062256
231
+ 2300,0.1827,0.21011948585510254
232
+ 2310,0.202,0.12402696162462234
233
+ 2320,0.1923,0.14612217247486115
234
+ 2330,0.1867,0.24147266149520874
235
+ 2340,0.2005,0.21704956889152527
236
+ 2350,0.1904,0.13054011762142181
237
+ 2360,0.2091,0.20976080000400543
238
+ 2370,0.1819,0.1872418373823166
239
+ 2380,0.1961,0.11212967336177826
240
+ 2390,0.2068,0.519498884677887
241
+ 2400,0.193,0.1346292942762375
242
+ 2410,0.1861,0.0962454080581665
243
+ 2420,0.1807,0.12570427358150482
244
+ 2430,0.1796,0.16203445196151733
245
+ 2440,0.1987,0.253316193819046
246
+ 2450,0.2074,0.27736690640449524
247
+ 2460,0.1954,0.29609182476997375
+ 2470,0.1956,0.1255633682012558
+ 2480,0.2004,0.15914912521839142
+ 2490,0.205,0.25490802526474
+ 2500,0.2187,0.24118031561374664
+ 2510,0.1896,0.18467594683170319
+ 2520,0.2042,0.20041191577911377
+ 2530,0.2206,0.12200062721967697
+ 2540,0.1939,0.5114597678184509
+ 2550,0.1983,0.2055499255657196
+ 2560,0.2125,0.143922820687294
+ 2570,0.2041,0.2345764935016632
+ 2580,0.1983,0.14161275327205658
+ 2590,0.1896,0.4477356970310211
+ 2600,0.2012,0.32437455654144287
+ 2610,0.2238,0.24398133158683777
+ 2620,0.1869,0.481764554977417
+ 2630,0.1811,0.14867210388183594
+ 2640,0.2152,0.13860155642032623
+ 2650,0.2209,0.26865389943122864
+ 2660,0.2056,0.19769036769866943
+ 2670,0.1852,0.08871612697839737
+ 2680,0.1861,0.12152489274740219
+ 2690,0.1978,0.22325195372104645
+ 2700,0.2172,0.39803528785705566
+ 2710,0.1911,0.16201449930667877
+ 2720,0.2146,0.08962103724479675
+ 2730,0.1866,0.2011992633342743
+ 2740,0.1886,0.3395402729511261
+ 2750,0.192,0.19809773564338684
+ 2760,0.2264,0.1411229819059372
+ 2770,0.2177,0.4905540645122528
+ 2780,0.2011,0.26518192887306213
+ 2790,0.197,0.22682157158851624
+ 2800,0.2008,0.20984944701194763
+ 2810,0.2017,0.09985147416591644
+ 2820,0.1953,0.13415023684501648
+ 2830,0.2037,0.11969766020774841
+ 2840,0.1947,0.1487891525030136
+ 2850,0.1962,0.4217297434806824
+ 2860,0.1886,0.13667447865009308
+ 2870,0.1934,0.3455916941165924
+ 2880,0.2089,0.18731869757175446
+ 2890,0.1978,0.14637391269207
+ 2900,0.1908,0.19625739753246307
+ 2910,0.1874,0.27000346779823303
+ 2920,0.2083,0.28899985551834106
+ 2930,0.2142,0.3836462199687958
+ 2940,0.1901,0.1850130259990692
+ 2950,0.2037,0.141523540019989
+ 2960,0.1905,0.3251464366912842
+ 2970,0.2095,0.1748805046081543
+ 2980,0.1868,0.14634093642234802
+ 2990,0.1977,0.36464419960975647
+ 3000,0.186,0.09717480093240738
+ 3010,0.1843,0.12364242225885391
+ 3020,0.1971,0.23551173508167267
+ 3030,0.2137,0.22094455361366272
+ 3040,0.1923,0.20646634697914124
+ 3050,0.1999,0.16831813752651215
+ 3060,0.1982,0.23634643852710724
+ 3070,0.1882,0.14036402106285095
+ 3080,0.1867,0.16198360919952393
+ 3090,0.1857,0.15361471474170685
+ 3100,0.1869,0.19663187861442566
+ 3110,0.1881,0.11526310443878174
+ 3120,0.1814,0.22192062437534332
+ 3130,0.2055,0.12529288232326508
+ 3140,0.1979,0.42167556285858154
+ 3150,0.1968,0.12197548151016235
+ 3160,0.2165,0.19705833494663239
+ 3170,0.2032,0.11476973444223404
+ 3180,0.1935,0.1344933658838272
+ 3190,0.1948,0.18272902071475983
+ 3200,0.1973,0.09915751963853836
+ 3210,0.1902,0.21683669090270996
+ 3220,0.2329,0.3918346166610718
+ 3230,0.1909,0.18044057488441467
+ 3240,0.1975,0.32754865288734436
+ 3250,0.1904,0.4130847454071045
+ 3260,0.1975,0.34668681025505066
+ 3270,0.1999,0.16089856624603271
+ 3280,0.193,0.21584224700927734
+ 3290,0.1939,0.16810452938079834
+ 3300,0.2048,0.17121155560016632
+ 3310,0.2165,0.1404232680797577
+ 3320,0.1985,0.14396339654922485
+ 3330,0.1918,0.20105883479118347
+ 3340,0.2126,0.18435832858085632
+ 3350,0.2062,0.14441746473312378
+ 3360,0.188,0.2610270380973816
+ 3370,0.183,0.1223735362291336
+ 3380,0.2119,0.2614642083644867
+ 3390,0.1949,0.1186862513422966
+ 3400,0.1967,0.2570422291755676
+ 3410,0.1911,0.22830010950565338
+ 3420,0.1952,0.20854175090789795
+ 3430,0.1886,0.13073670864105225
+ 3440,0.1911,0.15211446583271027
+ 3450,0.1825,0.21212461590766907
+ 3460,0.192,0.24107690155506134
+ 3470,0.1984,0.2833156883716583
+ 3480,0.1881,0.11552239954471588
+ 3490,0.1865,0.6871241927146912
+ 3500,0.1881,0.28000256419181824
+ 3510,0.2017,0.5156084299087524
+ 3520,0.1899,0.4344501197338104
+ 3530,0.1898,0.20910920202732086
+ 3540,0.1875,0.31540071964263916
+ 3550,0.1783,0.22151772677898407
+ 3560,0.1832,0.23169292509555817
+ 3570,0.1897,0.21004840731620789
+ 3580,0.1794,0.1364370435476303
+ 3590,0.1798,0.18477840721607208
+ 3600,0.1828,0.18409843742847443
+ 3610,0.1868,0.14838065207004547
+ 3620,0.1915,0.1855224221944809
+ 3630,0.1931,0.132468119263649
+ 3640,0.1892,0.1399061232805252
+ 3650,0.2153,0.14445742964744568
+ 3660,0.2112,0.1145782321691513
+ 3670,0.193,0.13386286795139313
+ 3680,0.1989,0.25447431206703186
+ 3690,0.2072,0.3270455598831177
+ 3700,0.201,0.11990506201982498
+ 3710,0.1903,0.12764547765254974
+ 3720,0.2062,0.40525418519973755
+ 3730,0.2144,0.3391278088092804
+ 3740,0.2112,0.11917085945606232
+ 3750,0.234,0.12978479266166687
+ 3760,0.1946,0.22144345939159393
+ 3770,0.1904,0.20441490411758423
+ 3780,0.1847,0.2381419688463211
+ 3790,0.1951,0.1428823173046112
+ 3800,0.1939,0.4492218494415283
+ 3810,0.1983,0.21098877489566803
+ 3820,0.2031,0.10859962552785873
+ 3830,0.1908,0.1391225904226303
+ 3840,0.1833,0.1572648286819458
+ 3850,0.1831,0.15222427248954773
+ 3860,0.2141,0.4904521107673645
+ 3870,0.1877,0.2860592007637024
+ 3880,0.1939,0.12134332954883575
+ 3890,0.1853,0.22850103676319122
+ 3900,0.1978,0.228472039103508
+ 3910,0.2006,0.10981755703687668
+ 3920,0.1884,0.23318102955818176
+ 3930,0.1991,0.3726746439933777
+ 3940,0.1959,0.26017436385154724
+ 3950,0.1917,0.1348070204257965
+ 3960,0.2079,0.23967556655406952
+ 3970,0.1974,0.6242204904556274
+ 3980,0.2171,0.13600404560565948
+ 3990,0.1791,0.20679210126399994
+ 4000,0.194,0.10915413498878479
+ 4010,0.1897,0.3919573426246643
+ 4020,0.2141,0.19863814115524292
+ 4030,0.1939,0.2351231426000595
+ 4040,0.1915,0.18652284145355225
+ 4050,0.1954,0.19517678022384644
+ 4060,0.1891,0.17028377950191498
+ 4070,0.196,0.21571074426174164
+ 4080,0.1902,0.20602403581142426
+ 4090,0.1895,0.2475835084915161
+ 4100,0.1956,0.4894787073135376
+ 4110,0.1808,0.41957294940948486
+ 4120,0.1845,0.2791914939880371
+ 4130,0.1998,0.26293325424194336
+ 4140,0.1851,0.31361421942710876
+ 4150,0.2119,0.11789387464523315
+ 4160,0.1889,0.12928029894828796
+ 4170,0.1822,0.16677114367485046
+ 4180,0.1774,0.26217782497406006
+ 4190,0.1855,0.173801988363266
+ 4200,0.1969,0.2360815554857254
+ 4210,0.1925,0.18878477811813354
+ 4220,0.1981,0.7121212482452393
+ 4230,0.1988,0.2006293386220932
+ 4240,0.1884,0.15101180970668793
+ 4250,0.1882,0.2716868817806244
+ 4260,0.1916,0.3079506754875183
+ 4270,0.1982,0.1469135731458664
+ 4280,0.2037,0.14991195499897003
+ 4290,0.1977,0.18982867896556854
+ 4300,0.2097,0.3601289987564087
+ 4310,0.1881,0.19968882203102112
+ 4320,0.1976,0.33335718512535095
+ 4330,0.1843,0.3473576009273529
+ 4340,0.1865,0.21573849022388458
+ 4350,0.2039,0.2450680285692215
+ 4360,0.1766,0.13361075520515442
+ 4370,0.1927,0.15630079805850983
+ 4380,0.1863,0.16141249239444733
+ 4390,0.2026,0.14967168867588043
+ 4400,0.1777,0.1990043669939041
+ 4410,0.2004,0.35991206765174866
+ 4420,0.1951,0.18987154960632324
+ 4430,0.1771,0.15781012177467346
+ 4440,0.1977,0.23537319898605347
+ 4450,0.1961,0.30584484338760376
+ 4460,0.209,0.19641929864883423
+ 4470,0.1897,0.19731459021568298
+ 4480,0.1771,0.15782621502876282
+ 4490,0.1924,0.2217131108045578
+ 4500,0.1835,0.2667541205883026
+ 4510,0.19,0.1263836920261383
+ 4520,0.2043,0.1596195250749588
+ 4530,0.207,0.16072484850883484
+ 4540,0.1862,0.44382864236831665
+ 4550,0.1888,0.1299673169851303
+ 4560,0.212,0.2095647007226944
+ 4570,0.1965,0.1726602464914322
+ 4580,0.1823,0.1426515132188797
+ 4590,0.2018,0.23668938875198364
+ 4600,0.2018,0.13374629616737366
+ 4610,0.174,0.08905677497386932
+ 4620,0.203,0.1713249832391739
+ 4630,0.1871,0.17429092526435852
+ 4640,0.1844,0.2214651256799698
+ 4650,0.1933,0.10598839819431305
+ 4660,0.1977,0.15706612169742584
+ 4670,0.193,0.24813860654830933
+ 4680,0.1803,0.32129591703414917
+ 4690,0.2034,0.3526042401790619
+ 4700,0.2032,0.31708043813705444
+ 4710,0.1862,0.4131239056587219
+ 4720,0.1963,0.3810931444168091
+ 4730,0.1867,0.13151924312114716
+ 4740,0.208,0.2725364565849304
+ 4750,0.1898,0.22126230597496033
+ 4760,0.2123,0.13343718647956848
+ 4770,0.1893,0.13617506623268127
+ 4780,0.1754,0.18784309923648834
+ 4790,0.1769,0.24734441936016083
+ 4800,0.1853,0.13228961825370789
+ 4810,0.2186,0.2793158292770386
+ 4820,0.1944,0.12285496294498444
+ 4830,0.1814,0.16356608271598816
+ 4840,0.2068,0.20437662303447723
+ 4850,0.1961,0.13596777617931366
+ 4860,0.2036,0.14505477249622345
+ 4870,0.1804,0.1674296259880066
+ 4880,0.1913,0.09710420668125153
+ 4890,0.2029,0.17767125368118286
+ 4900,0.1835,0.3481147885322571
+ 4910,0.1991,0.5457572340965271
+ 4920,0.1875,0.4577218294143677
+ 4930,0.1807,0.2871100902557373
+ 4940,0.1957,0.3030782639980316
+ 4950,0.1795,0.23293675482273102
+ 4960,0.1962,0.22913801670074463
+ 4970,0.1894,0.12194394320249557
+ 4980,0.1932,0.46661141514778137
+ 4990,0.2025,0.45983535051345825
+ 5000,0.1819,0.15152078866958618
+ 5010,0.2082,0.39475178718566895
+ 5020,0.1944,0.21548931300640106
+ 5030,0.1827,0.11637167632579803
+ 5040,0.188,0.27838513255119324
+ 5050,0.1789,0.23179729282855988
+ 5060,0.1977,0.22427043318748474
+ 5070,0.1892,0.36368921399116516
+ 5080,0.1923,0.1129203587770462
+ 5090,0.1755,0.14689409732818604
+ 5100,0.1761,0.16220401227474213
+ 5110,0.2053,0.3488210141658783
+ 5120,0.1733,0.37989169359207153
+ 5130,0.1968,0.19538883864879608
+ 5140,0.2159,0.3875751793384552
+ 5150,0.1975,0.31660810112953186
+ 5160,0.222,0.17016595602035522
+ 5170,0.18,0.23723351955413818
+ 5180,0.1776,0.1705704629421234
+ 5190,0.1892,0.17754660546779633
+ 5200,0.1893,0.12749704718589783
+ 5210,0.191,0.1611369550228119
+ 5220,0.1822,0.11559612303972244
+ 5230,0.1986,0.10376226902008057
+ 5240,0.2185,0.35661613941192627
+ 5250,0.2186,0.2742044925689697
+ 5260,0.2017,0.34222495555877686
+ 5270,0.1942,0.11810880899429321
+ 5280,0.1865,0.19386634230613708
+ 5290,0.1965,0.23785775899887085
+ 5300,0.1781,0.151785209774971
+ 5310,0.1825,0.10795265436172485
+ 5320,0.1839,0.29830363392829895
+ 5330,0.2209,0.24082113802433014
+ 5340,0.1904,0.12022729218006134
+ 5350,0.1937,0.22289042174816132
+ 5360,0.2109,0.12292595207691193
+ 5370,0.1904,0.3426815867424011
+ 5380,0.1984,0.15332967042922974
+ 5390,0.2034,0.22630643844604492
+ 5400,0.1755,0.0931999608874321
+ 5410,0.1969,0.25632500648498535
+ 5420,0.1797,0.18699921667575836
+ 5430,0.1882,0.22163766622543335
+ 5440,0.1748,0.13015256822109222
+ 5450,0.2005,0.1933586299419403
+ 5460,0.1904,0.2721489667892456
+ 5470,0.1997,0.32976672053337097
+ 5480,0.1981,0.10539891570806503
+ 5490,0.1891,0.3012673556804657
+ 5500,0.2081,0.20593006908893585
+ 5510,0.1988,0.21603475511074066
+ 5520,0.1887,0.2905438542366028
+ 5530,0.1886,0.1223142221570015
+ 5540,0.1868,0.37161391973495483
+ 5550,0.1858,0.10761993378400803
+ 5560,0.1911,0.30840712785720825
+ 5570,0.1927,0.12379074096679688
+ 5580,0.1866,0.41834160685539246
+ 5590,0.2197,0.1799428015947342
+ 5600,0.1841,0.12476979941129684
+ 5610,0.1809,0.25949403643608093
+ 5620,0.1977,0.4696250855922699
+ 5630,0.196,0.2974794805049896
+ 5640,0.1784,0.11155738681554794
+ 5650,0.1811,0.3147721290588379
+ 5660,0.2022,0.2068697065114975
+ 5670,0.1875,0.12708576023578644
+ 5680,0.1862,0.15333087742328644
+ 5690,0.2008,0.14552031457424164
+ 5700,0.2152,0.22572584450244904
+ 5710,0.2007,0.19288545846939087
+ 5720,0.2046,0.16341878473758698
+ 5730,0.1923,0.1080252081155777
+ 5740,0.2101,0.2719382047653198
+ 5750,0.1689,0.2168358713388443
+ 5760,0.179,0.31658855080604553
+ 5770,0.1702,0.31429606676101685
+ 5780,0.1714,0.16168814897537231
+ 5790,0.1806,0.2045605480670929
+ 5800,0.1921,0.1508999913930893
+ 5810,0.2014,0.11083026975393295
+ 5820,0.1887,0.1291135549545288
+ 5830,0.1841,0.2196565568447113
+ 5840,0.1976,0.178413987159729
+ 5850,0.186,0.18053841590881348
+ 5860,0.1849,0.11606927216053009
+ 5870,0.1893,0.10893714427947998
+ 5880,0.2041,0.15770243108272552
+ 5890,0.1918,0.17247073352336884
+ 5900,0.1975,0.14984384179115295
+ 5910,0.1838,0.27113232016563416
+ 5920,0.1959,0.31966206431388855
+ 5930,0.1911,0.1655171811580658
+ 5940,0.2015,0.4618909955024719
+ 5950,0.181,0.17289946973323822
+ 5960,0.1805,0.18293440341949463
+ 5970,0.2165,0.2430201917886734
+ 5980,0.1858,0.21155020594596863
+ 5990,0.2011,0.10135482996702194
+ 6000,0.2064,0.32442012429237366
+ 6010,0.1947,0.2617526054382324
+ 6020,0.194,0.1327558159828186
+ 6030,0.2117,0.36135196685791016
+ 6040,0.1733,0.1543770283460617
+ 6050,0.1859,0.11700884252786636
+ 6060,0.1919,0.1485334038734436
+ 6070,0.1844,0.19141431152820587
+ 6080,0.2285,0.19183652102947235
+ 6090,0.1989,0.1868680715560913
+ 6100,0.2001,0.16714945435523987
+ 6110,0.2014,0.23409946262836456
+ 6120,0.1896,0.2153204083442688
+ 6130,0.1895,0.12315434217453003
+ 6140,0.192,0.185602605342865
+ 6150,0.1858,0.1360010802745819
+ 6160,0.192,0.15563714504241943
+ 6170,0.1706,0.21236270666122437
+ 6180,0.2014,0.4081701338291168
+ 6190,0.1931,0.18205949664115906
+ 6200,0.2058,0.3085598945617676
+ 6210,0.1831,0.4010489881038666
+ 6220,0.1904,0.14054065942764282
+ 6230,0.205,0.14924605190753937
+ 6240,0.1735,0.11415160447359085
+ 6250,0.1862,0.3156155049800873
+ 6260,0.1758,0.19247853755950928
+ 6270,0.1757,0.3346346616744995
+ 6280,0.1732,0.12410330772399902
+ 6290,0.1854,0.1532019078731537
+ 6300,0.1955,0.12374981492757797
+ 6310,0.1978,0.20005425810813904
+ 6320,0.1936,0.13402165472507477
+ 6330,0.1983,0.4172665476799011
+ 6340,0.1879,0.1336435228586197
+ 6350,0.196,0.3286299407482147
+ 6360,0.1957,0.32011619210243225
+ 6370,0.1867,0.2597643733024597
+ 6380,0.2017,0.2771315574645996
+ 6390,0.1839,0.19237180054187775
+ 6400,0.181,0.29834896326065063
+ 6410,0.1832,0.1240164116024971
+ 6420,0.206,0.2248154878616333
+ 6430,0.1908,0.1138843446969986
+ 6440,0.2111,0.40525248646736145
+ 6450,0.2051,0.14319169521331787
+ 6460,0.2001,0.1894819438457489
+ 6470,0.1841,0.1754332333803177
+ 6480,0.1959,0.19362223148345947
+ 6490,0.1799,0.20186428725719452
+ 6500,0.1931,0.5195144414901733
+ 6510,0.185,0.19907741248607635
+ 6520,0.19,0.16064997017383575
+ 6530,0.1936,0.1789844036102295
+ 6540,0.1953,0.26900771260261536
+ 6550,0.1855,0.14407795667648315
+ 6560,0.1869,0.3342307209968567
+ 6570,0.1865,0.38254573941230774
+ 6580,0.1793,0.314861536026001
+ 6590,0.2028,0.28199201822280884
+ 6600,0.1845,0.19082097709178925
+ 6610,0.1961,0.18809722363948822
+ 6620,0.1784,0.09436540305614471
+ 6630,0.1841,0.2130885273218155
+ 6640,0.1803,0.15509673953056335
+ 6650,0.195,0.3256995379924774
+ 6660,0.1953,0.16595757007598877
+ 6670,0.216,0.151058629155159
+ 6680,0.2034,0.12330947816371918
+ 6690,0.1974,0.26234740018844604
+ 6700,0.1917,0.4495633542537689
+ 6710,0.1939,0.2204718142747879
+ 6720,0.1859,0.22135566174983978
+ 6730,0.1966,0.1888224184513092
+ 6740,0.1888,0.1313880831003189
+ 6750,0.1842,0.3557686507701874
+ 6760,0.2169,0.23171742260456085
+ 6770,0.1917,0.30755820870399475
+ 6780,0.1939,0.2494126707315445
+ 6790,0.1811,0.19220732152462006
+ 6800,0.1901,0.2093030959367752
+ 6810,0.1994,0.4100380539894104
+ 6820,0.212,0.1432579904794693
+ 6830,0.1985,0.29103121161460876
+ 6840,0.182,0.1525057703256607
+ 6850,0.1978,0.16088923811912537
+ 6860,0.1941,0.09459532797336578
+ 6870,0.1848,0.10758478939533234
+ 6880,0.1866,0.11241775751113892
+ 6890,0.1721,0.3859238922595978
+ 6900,0.2114,0.09421907365322113
+ 6910,0.191,0.15955796837806702
+ 6920,0.2016,0.19874829053878784
+ 6930,0.1842,0.1385822594165802
+ 6940,0.1834,0.32099974155426025
+ 6950,0.1719,0.33326661586761475
+ 6960,0.1871,0.15569132566452026
+ 6970,0.1901,0.24923570454120636
+ 6980,0.1873,0.16264751553535461
+ 6990,0.1939,0.1684579998254776
+ 7000,0.2016,0.40998804569244385
+ 7010,0.1881,0.2120629996061325
+ 7020,0.195,0.1460552215576172
+ 7030,0.1932,0.1706569939851761
+ 7040,0.201,0.18198959529399872
+ 7050,0.1858,0.3355713188648224
+ 7060,0.1894,0.28271225094795227
+ 7070,0.2079,0.21678537130355835
+ 7080,0.1978,0.1644618958234787
+ 7090,0.1962,0.43363186717033386
+ 7100,0.2063,0.12593208253383636
+ 7110,0.1872,0.23067864775657654
+ 7120,0.1988,0.21921464800834656
+ 7130,0.1924,0.31100329756736755
+ 7140,0.1917,0.14324329793453217
+ 7150,0.2103,0.1136397197842598
+ 7160,0.1991,0.12539143860340118
+ 7170,0.1968,0.14862652122974396
+ 7180,0.1998,0.18603764474391937
+ 7190,0.189,0.2600877285003662
+ 7200,0.1896,0.2621091306209564
+ 7210,0.196,0.3635103404521942
+ 7220,0.1895,0.1914694756269455
+ 7230,0.1922,0.22275245189666748
+ 7240,0.1915,0.21839484572410583
+ 7250,0.1813,0.20325468480587006
+ 7260,0.1924,0.20031067728996277
+ 7270,0.188,0.11527217924594879
+ 7280,0.1769,0.18948520720005035
+ 7290,0.1947,0.3682481050491333
+ 7300,0.1895,0.1695318967103958
+ 7310,0.1822,0.26400190591812134
+ 7320,0.1868,0.11796242743730545
+ 7330,0.1847,0.5708523392677307
+ 7340,0.1982,0.1770615428686142
+ 7350,0.1944,0.524164080619812
+ 7360,0.1851,0.19294540584087372
+ 7370,0.2099,0.34142595529556274
+ 7380,0.1988,0.14007115364074707
+ 7390,0.1928,0.15258093178272247
+ 7400,0.1943,0.19033628702163696
+ 7410,0.1848,0.16659627854824066
+ 7420,0.1917,0.20548418164253235
+ 7430,0.1987,0.18582993745803833
+ 7440,0.2079,0.4724338948726654
+ 7450,0.2016,0.3675433397293091
+ 7460,0.1971,0.4193277060985565
+ 7470,0.2127,0.15622088313102722
+ 7480,0.1924,0.3101946711540222
+ 7490,0.1847,0.3204960525035858
+ 7500,0.207,0.21637819707393646
+ 7510,0.1899,0.10439030081033707
+ 7520,0.1898,0.23369431495666504
+ 7530,0.2053,0.217510387301445
+ 7540,0.1985,0.12691344320774078
+ 7550,0.1902,0.17355982959270477
+ 7560,0.1912,0.16318906843662262
+ 7570,0.1865,0.13026556372642517
+ 7580,0.1897,0.21534563601016998
+ 7590,0.1835,0.1155862808227539
+ 7600,0.1787,0.09331275522708893
+ 7610,0.1864,0.262445867061615
+ 7620,0.1808,0.13018622994422913
+ 7630,0.1871,0.09540057182312012
+ 7640,0.1799,0.18739309906959534
+ 7650,0.1996,0.33603695034980774
+ 7660,0.173,0.15169212222099304
+ 7670,0.1964,0.22183649241924286
+ 7680,0.1944,0.22191403806209564
+ 7690,0.1925,0.10815884172916412
+ 7700,0.1979,0.11126530170440674
+ 7710,0.2052,0.4693167209625244
+ 7720,0.1735,0.22466649115085602
+ 7730,0.1955,0.26078879833221436
+ 7740,0.1716,0.18148629367351532
+ 7750,0.1915,0.14516060054302216
+ 7760,0.1765,0.2591620087623596
+ 7770,0.2009,0.13933546841144562
+ 7780,0.175,0.13252204656600952
+ 7790,0.1787,0.19723990559577942
+ 7800,0.1999,0.20300233364105225
+ 7810,0.1953,0.19206602871418
+ 7820,0.1882,0.2578635513782501
+ 7830,0.202,0.18145211040973663
+ 7840,0.1816,0.3032558858394623
+ 7850,0.1826,0.13542884588241577
+ 7860,0.1681,0.4135322570800781
+ 7870,0.1867,0.13162481784820557
+ 7880,0.1878,0.20694993436336517
+ 7890,0.1839,0.12554612755775452
+ 7900,0.1877,0.13847218453884125
+ 7910,0.2013,0.20203669369220734
+ 7920,0.1843,0.11584684997797012
+ 7930,0.1803,0.43143773078918457
+ 7940,0.1875,0.2520279586315155
+ 7950,0.1784,0.10930690169334412
+ 7960,0.2071,0.16624777019023895
+ 7970,0.2001,0.14964212477207184
+ 7980,0.1924,0.15127533674240112
+ 7990,0.1942,0.16396445035934448
+ 8000,0.1906,0.1415846049785614
+ 8010,0.1786,0.16897065937519073
+ 8020,0.1856,0.26051029562950134
+ 8030,0.1868,0.26480433344841003
+ 8040,0.1898,0.19422131776809692
+ 8050,0.1954,0.522674024105072
+ 8060,0.1987,0.22055862843990326
+ 8070,0.1934,0.22084835171699524
+ 8080,0.1904,0.2467452585697174
+ 8090,0.2004,0.2391338348388672
+ 8100,0.1906,0.21444933116436005
+ 8110,0.1935,0.41174784302711487
+ 8120,0.1832,0.14398345351219177
+ 8130,0.1969,0.16611593961715698
+ 8140,0.185,0.24117395281791687
+ 8150,0.2064,0.16278576850891113
+ 8160,0.1912,0.22578538954257965
+ 8170,0.1852,0.11014354974031448
+ 8180,0.1838,0.15677596628665924
+ 8190,0.192,0.322907418012619
+ 8200,0.1888,0.15526893734931946
+ 8210,0.1988,0.719232976436615
+ 8220,0.1798,0.21567432582378387
+ 8230,0.2142,0.14568041265010834
+ 8240,0.19,0.10558228939771652
+ 8250,0.1997,0.4023348093032837
+ 8260,0.1963,0.16071440279483795
+ 8270,0.1951,0.1530512422323227
+ 8280,0.1947,0.2199752777814865
+ 8290,0.1897,0.1199599876999855
+ 8300,0.1929,0.19223779439926147
+ 8310,0.195,0.18949465453624725
+ 8320,0.1847,0.11131028085947037
+ 8330,0.1819,0.2341558188199997
+ 8340,0.1797,0.19725611805915833
+ 8350,0.1926,0.13978533446788788
+ 8360,0.1855,0.1504231095314026
+ 8370,0.1941,0.34350261092185974
+ 8380,0.1948,0.3222501873970032
+ 8390,0.1811,0.1929732710123062
+ 8400,0.1689,0.2489142268896103
+ 8410,0.2022,0.18194100260734558
+ 8420,0.2108,0.16286848485469818
+ 8430,0.1887,0.1424311250448227
+ 8440,0.1793,0.38083863258361816
+ 8450,0.1965,0.3764590919017792
+ 8460,0.1913,0.2941056787967682
+ 8470,0.1785,0.3669663965702057
+ 8480,0.1956,0.23171833157539368
+ 8490,0.1954,0.4153014123439789
+ 8500,0.1831,0.15466581284999847
+ 8510,0.1896,0.12290091812610626
+ 8520,0.1886,0.1719137281179428
+ 8530,0.1983,0.22029288113117218
+ 8540,0.1861,0.44206833839416504
+ 8550,0.1917,0.16690091788768768
+ 8560,0.19,0.18483784794807434
+ 8570,0.1814,0.18241430819034576
+ 8580,0.1911,0.17267461121082306
+ 8590,0.1958,0.17842534184455872
+ 8600,0.1786,0.23683862388134003
+ 8610,0.1871,0.14404167234897614
+ 8620,0.1966,0.19246040284633636
+ 8630,0.191,0.31685906648635864
+ 8640,0.1902,0.15502500534057617
+ 8650,0.189,0.1416531354188919
+ 8660,0.1959,0.1969127207994461
+ 8670,0.206,0.14259091019630432
+ 8680,0.1999,0.22570674121379852
+ 8690,0.1827,0.14562635123729706
+ 8700,0.1862,0.27321743965148926
+ 8710,0.188,0.382001131772995
+ 8720,0.1802,0.3076627850532532
+ 8730,0.1951,0.22753578424453735
+ 8740,0.1869,0.30095112323760986
+ 8750,0.1845,0.12934041023254395
+ 8760,0.1812,0.1389361023902893
+ 8770,0.19,0.23508881032466888
+ 8780,0.1979,0.3939878046512604
+ 8790,0.1938,0.18323196470737457
+ 8800,0.1762,0.12859997153282166
+ 8810,0.1887,0.09579361975193024
+ 8820,0.1826,0.24968503415584564
+ 8830,0.1815,0.3072535991668701
+ 8840,0.1906,0.20545276999473572
+ 8850,0.1887,0.23975001275539398
+ 8860,0.2031,0.2402612715959549
+ 8870,0.1801,0.3224344253540039
+ 8880,0.1964,0.12220064550638199
+ 8890,0.188,0.40204569697380066
+ 8900,0.1907,0.1302637755870819
+ 8910,0.1896,0.1664087474346161
+ 8920,0.2034,0.17688220739364624
+ 8930,0.2041,0.15294149518013
+ 8940,0.2064,0.10613101720809937
+ 8950,0.1719,0.259779691696167
+ 8960,0.1979,0.1018250435590744
+ 8970,0.1784,0.33674901723861694
+ 8980,0.1884,0.24169038236141205
+ 8990,0.1857,0.44149723649024963
+ 9000,0.1902,0.14126497507095337
+ 9010,0.192,0.16225175559520721
+ 9020,0.2049,0.19235830008983612
+ 9030,0.1927,0.2529277503490448
+ 9040,0.1854,0.12438632547855377
+ 9050,0.1882,0.15202412009239197
+ 9060,0.1898,0.2705388069152832
+ 9070,0.1946,0.1650957465171814
+ 9080,0.1957,0.23341186344623566
+ 9090,0.1754,0.16597703099250793
+ 9100,0.1795,0.2937939465045929
+ 9110,0.1939,0.32507333159446716
+ 9120,0.1926,0.2513993978500366
+ 9130,0.1886,0.11721107363700867
+ 9140,0.1808,0.24468494951725006
+ 9150,0.1896,0.31609785556793213
+ 9160,0.1898,0.20091809332370758
+ 9170,0.1941,0.13135437667369843
+ 9180,0.1936,0.31788596510887146
+ 9190,0.1807,0.27197974920272827
+ 9200,0.1746,0.15166611969470978
+ 9210,0.1888,0.19648128747940063
+ 9220,0.193,0.16385284066200256
+ 9230,0.1858,0.16671708226203918
+ 9240,0.2032,0.16677770018577576
+ 9250,0.1991,0.15752294659614563
+ 9260,0.182,0.13784664869308472
+ 9270,0.1861,0.10085441917181015
+ 9280,0.1755,0.28498342633247375
+ 9290,0.1899,0.1942908763885498
+ 9300,0.2113,0.3462575674057007
+ 9310,0.1783,0.17943081259727478
+ 9320,0.1887,0.34774503111839294
+ 9330,0.2008,0.2243794947862625
+ 9340,0.2006,0.33637240529060364
+ 9350,0.1931,0.13265320658683777
+ 9360,0.175,0.17303265631198883
+ 9370,0.1931,0.1601858139038086
+ 9375,nan,nan
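Since training_logs.csv is uploaded as a standalone file in this commit, the rows above are easier to inspect programmatically than to read from the diff. The snippet below is a minimal, illustrative sketch (not part of the repository) for loading and plotting the logged metrics; the CSV header is not visible in this excerpt, so the code assumes a header row exists and otherwise addresses columns by position, treating the first column as the training step.

```python
# Minimal sketch, not part of this repository: load training_logs.csv and
# plot each logged metric against the training step.
# Assumptions: the file has a header row, the first column is the step, and
# the remaining columns are numeric metrics (column names are unknown here).
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("training_logs.csv")
df = df.dropna()  # the final row (step 9375 above) logs nan,nan

step = df.iloc[:, 0]
metrics = df.columns[1:]

fig, axes = plt.subplots(1, len(metrics), figsize=(5 * len(metrics), 4), squeeze=False)
for ax, col in zip(axes[0], metrics):
    ax.plot(step, df[col])
    ax.set_xlabel(df.columns[0])
    ax.set_title(str(col))
fig.tight_layout()
plt.show()
```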
vocab.json ADDED
The diff for this file is too large to render. See raw diff