shaddy43443 committed (verified)
Commit 840e214 · Parent: cd2a44e

Initial commit of OlgunLLM
.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
33
  *.zip filter=lfs diff=lfs merge=lfs -text
34
  *.zst filter=lfs diff=lfs merge=lfs -text
35
  *tfevents* filter=lfs diff=lfs merge=lfs -text
36
+ tokenizer.json filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,336 @@
1
+ ---
2
+ license: apache-2.0
3
+ tags:
4
+ - text-generation-inference
5
+ - finance
6
+ - economics
7
+ datasets:
8
+ - Josephgflowers/Finance-Instruct-500k
9
+ language:
10
+ - en
11
+ base_model:
12
+ - unsloth/Meta-Llama-3.1-8B
13
+ pipeline_tag: text-generation
14
+ library_name: transformers
15
+ ---
16
+
17
+ # Model Card for Finance-Llama-8B
18
+
19
+ This model is a fine-tuned version of `unsloth/Meta-Llama-3.1-8B` on the `Josephgflowers/Finance-Instruct-500k` dataset. It's designed for financial tasks, reasoning, and multi-turn conversations.
20
+
21
+ ## Key Features
22
+
23
+ * **Extensive Coverage:** Trained on over 500,000 entries spanning financial QA, reasoning, sentiment analysis, topic classification, multilingual NER, and conversational AI. 📚
24
+ * **Multi-Turn Conversations:** Capable of rich dialogues emphasizing contextual understanding and reasoning.
25
+ * **Diverse Data Sources:** Includes entries from Cinder, Sujet-Finance-Instruct-177k, Phinance Dataset, BAAI/IndustryInstruction_Finance-Economics, Josephgflowers/Financial-NER-NLP, and many other high-quality datasets.
26
+ * **Financial Specialization:** Tailored for financial reasoning, question answering, entity recognition, sentiment analysis, and more.
27
+
28
+ ## Author & Opportunities
29
+ Tarun Sai Goddu - Data Scientist at Jio Platforms Ltd (2+ years experience) | IIT Bombay
30
+
31
+ Expertise: AI Agents • RAG Pipelines • Computer Vision • NLP • Speech Domain
32
+
33
+ **Actively seeking opportunities** as an **ML Engineer II / Data Scientist II** where I can contribute to building scalable, production-ready AI/ML systems.
34
+
35
+ **Reach me here:**
36
+ - LinkedIn: [linkedin.com/in/tarunsaigoddu](https://www.linkedin.com/in/tarunsaigoddu) | Email: [[email protected]](mailto:[email protected])
37
+ - GitHub: [github.com/tarun7r](https://github.com/tarun7r)
38
+
39
+
40
+ ## Dataset Details 💾
41
+
42
+ ### Finance-Instruct-500k Dataset
43
+
44
+ **Overview**
45
+ Finance-Instruct-500k is a comprehensive and meticulously curated dataset designed to train advanced language models for financial tasks, reasoning, and multi-turn conversations. Combining data from numerous high-quality financial datasets, this corpus provides over 500,000 entries, offering unparalleled depth and versatility for finance-related instruction tuning and fine-tuning.
46
+
47
+ The dataset includes content tailored for financial reasoning, question answering, entity recognition, sentiment analysis, address parsing, and multilingual natural language processing (NLP). Its diverse and deduplicated entries make it suitable for a wide range of financial AI applications, including domain-specific assistants, conversational agents, and information extraction systems.
48
+
49
+ **Key Features of the Dataset**
50
+ * **Extensive Coverage:** Over 500,000 entries spanning financial QA, reasoning, sentiment analysis, topic classification, multilingual NER, and conversational AI. 🌍
51
+ * **Multi-Turn Conversations:** Rich dialogues emphasizing contextual understanding and reasoning. 🗣️
52
+ * **Diverse Data Sources:** Includes entries from Cinder, Sujet-Finance-Instruct-177k, Phinance Dataset, BAAI/IndustryInstruction_Finance-Economics, Josephgflowers/Financial-NER-NLP, and many other high-quality datasets. 📖
53
+
54
+ ## CFA Level 1 Mock Exam Results
55
+
56
+ The CFA (Chartered Financial Analyst) exam is widely recognized as one of the most challenging professional certifications in the financial industry, typically requiring over 1,000 hours of study across all three levels. The evaluation approach for the CFA Level 1 mock exam was inspired by the work on [mukaj/Llama-3.1-Hawkish-8B](https://huggingface.co/mukaj/Llama-3.1-Hawkish-8B). Below is a comparison of several models on a sample CFA Level 1 mock exam, using the same prompt for every model. The results are approximate and were checked across multiple mock exam papers for consistency.
57
+
58
+ <table>
59
+ <thead>
60
+ <tr>
61
+ <th>CFA Level 1</th>
62
+ <th>GPT-4o-mini (%)</th>
63
+ <th>Finance-Llama-8B (%)</th>
64
+ <th>Meta-Llama Instruct 8B (%)</th>
65
+ <th>Meta-Llama Instruct 70B (%)</th>
66
+ </tr>
67
+ </thead>
68
+ <tbody>
69
+ <tr>
70
+ <td>Ethical and Professional Standards</td>
71
+ <td>80</td>
72
+ <td>76</td>
73
+ <td>56</td>
74
+ <td>68</td>
75
+ </tr>
76
+ <tr>
77
+ <td>Quantitative Methods</td>
78
+ <td>74</td>
79
+ <td>73</td>
80
+ <td>64</td>
81
+ <td>85</td>
82
+ </tr>
83
+ <tr>
84
+ <td>Economics</td>
85
+ <td>69</td>
86
+ <td>74</td>
87
+ <td>59</td>
88
+ <td>59</td>
89
+ </tr>
90
+ <tr>
91
+ <td>Financial Reporting</td>
92
+ <td>81</td>
93
+ <td>77</td>
94
+ <td>67</td>
95
+ <td>71</td>
96
+ </tr>
97
+ <tr>
98
+ <td>Corporate Finance</td>
99
+ <td>82</td>
100
+ <td>71</td>
101
+ <td>51</td>
102
+ <td>80</td>
103
+ </tr>
104
+ <tr>
105
+ <td>Equity Investments</td>
106
+ <td>53</td>
107
+ <td>67</td>
108
+ <td>43</td>
109
+ <td>66</td>
110
+ </tr>
111
+ <tr>
112
+ <td>Fixed Income</td>
113
+ <td>80</td>
114
+ <td>72</td>
115
+ <td>29</td>
116
+ <td>51</td>
117
+ </tr>
118
+ <tr>
119
+ <td>Derivatives</td>
120
+ <td>54</td>
121
+ <td>72</td>
122
+ <td>34</td>
123
+ <td>35</td>
124
+ </tr>
125
+ <tr>
126
+ <td>Alternative Investments</td>
127
+ <td>100</td>
128
+ <td>89</td>
129
+ <td>74</td>
130
+ <td>100</td>
131
+ </tr>
132
+ <tr>
133
+ <td>Portfolio Management</td>
134
+ <td>85</td>
135
+ <td>75</td>
136
+ <td>52</td>
137
+ <td>100</td>
138
+ </tr>
139
+ <tr>
140
+ <td><b>Weighted Average</b></td>
141
+ <td><b>75</b></td>
142
+ <td><b>73</b></td>
143
+ <td>53</td>
144
+ <td><b>70</b></td>
145
+ </tr>
146
+ <tr>
147
+ <td><b>Result</b></td>
148
+ <td><b>PASS</b></td>
149
+ <td><b>PASS</b></td>
150
+ <td><b>FAIL</b></td>
151
+ <td><b>PASS</b></td>
152
+ </tr>
153
+ </tbody>
154
+ </table>
155
+
156
+ The mock exams are designed to challenge candidates with varying levels of difficulty, reflecting the rigorous nature of the CFA Level 1 exam. Passing thresholds for these mock exams typically range from 64% to 72%, averaging around 67%, which is slightly above the 12-year average Minimum Passing Score (MPS) of 65% for the actual exam, making the mocks a somewhat stricter bar. For more detail, see [300hours.com/cfa-passing-score](https://300hours.com/cfa-passing-score/).
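+ 
+ For readers who want to reproduce the topic-weighted aggregate from the table above, a minimal sketch is shown below. The per-topic weights are illustrative assumptions (exact topic weights vary by exam sitting), and `scores` uses the Finance-Llama-8B column:
+ 
+ ```python
+ # Minimal sketch: topic-weighted aggregate of mock-exam scores.
+ # The weights are illustrative assumptions, not official CFA topic weights.
+ scores = {
+     "Ethical and Professional Standards": 76, "Quantitative Methods": 73,
+     "Economics": 74, "Financial Reporting": 77, "Corporate Finance": 71,
+     "Equity Investments": 67, "Fixed Income": 72, "Derivatives": 72,
+     "Alternative Investments": 89, "Portfolio Management": 75,
+ }
+ weights = {  # assumed weights, summing to 1.0
+     "Ethical and Professional Standards": 0.17, "Quantitative Methods": 0.09,
+     "Economics": 0.09, "Financial Reporting": 0.13, "Corporate Finance": 0.09,
+     "Equity Investments": 0.12, "Fixed Income": 0.12, "Derivatives": 0.07,
+     "Alternative Investments": 0.06, "Portfolio Management": 0.06,
+ }
+ weighted_avg = sum(scores[t] * weights[t] for t in scores)
+ print(f"Weighted average: {weighted_avg:.1f}%")  # ~74% under these assumed weights
+ ```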
157
+
158
+
159
+ ## Usage
160
+
161
+ ### Ollama
162
+
163
+ You can run this model with Ollama. Pre-built GGUF versions (FP16 and Q4_K_M) are available at:
164
+ [ollama.com/martain7r/finance-llama-8b](https://ollama.com/martain7r/finance-llama-8b)
165
+
166
+ To run the FP16 version:
167
+ ```bash
168
+ ollama run martain7r/finance-llama-8b:fp16
169
+ ```
170
+
171
+ To run the Q4_K_M quantized version (smaller and faster, with a slight trade-off in quality):
172
+ ```bash
173
+ ollama run martain7r/finance-llama-8b:q4_k_m
174
+ ```
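+ 
+ Once the model is served locally, Ollama also exposes a REST API (on port 11434 by default). A minimal sketch using Python's `requests` library is shown below; the endpoint and payload follow Ollama's `/api/generate` interface, and the prompt is only illustrative:
+ 
+ ```python
+ # Minimal sketch: query the locally served Ollama model over its REST API.
+ # Assumes the Ollama server is running and the model tag has been pulled.
+ import requests
+ 
+ resp = requests.post(
+     "http://localhost:11434/api/generate",
+     json={
+         "model": "martain7r/finance-llama-8b:q4_k_m",
+         "prompt": "Explain the difference between a stock and a bond.",
+         "stream": False,  # return the full response as a single JSON object
+     },
+ )
+ print(resp.json()["response"])
+ ```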
175
+
176
+ ### Transformers Pipeline
+ 
+ The model can also be used with the `transformers` library pipeline for text generation.
177
+
178
+ First, make sure you have the `transformers` and `torch` libraries installed:
179
+
180
+ ````bash
181
+ pip install transformers torch
182
+ ````
183
+
184
186
+
187
+ ````python
+ from transformers import pipeline, AutoModelForCausalLM, AutoTokenizer
+ import torch
+ 
+ model_id = "tarun7r/Finance-Llama-8B"
+ 
+ print("Loading model with memory optimizations...")
+ 
+ # Memory-efficient loading without bitsandbytes, trying progressively more
+ # conservative options:
+ #   1. FP16 with automatic device placement (~50% less memory than FP32)
+ #   2. FP16 with balanced GPU/CPU offloading
+ #   3. Full CPU loading as a fallback
+ model = None
+ for device_map, label in [("auto", "FP16"), ("balanced", "CPU offloading"), ("cpu", "CPU")]:
+     try:
+         print(f"Trying {label} loading...")
+         model = AutoModelForCausalLM.from_pretrained(
+             model_id,
+             torch_dtype=torch.float16,   # half precision
+             device_map=device_map,       # device placement strategy
+             low_cpu_mem_usage=True,      # efficient CPU memory usage during loading
+             trust_remote_code=True,
+         )
+         print(f"✓ Model loaded with {label}")
+         break
+     except Exception as e:
+         print(f"{label} loading failed: {e}")
+ 
+ # Load tokenizer
+ tokenizer = AutoTokenizer.from_pretrained(model_id)
+ if tokenizer.pad_token is None:
+     tokenizer.pad_token = tokenizer.eos_token
+ 
+ # Create pipeline
+ generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
+ print("✓ Pipeline created successfully!")
+ 
+ # Alpaca-style prompt template used for fine-tuning
+ finance_prompt_template = """Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.
+ 
+ ### Instruction:
+ {}
+ 
+ ### Input:
+ {}
+ 
+ ### Response:
+ """
+ 
+ instruction = ("You are a highly knowledgeable finance chatbot. Your purpose is to provide "
+                "accurate, insightful, and actionable financial advice to users, tailored to "
+                "their specific needs and contexts.")
+ user_input = "What strategies can an individual investor use to diversify their portfolio effectively in a volatile market?"
+ prompt = finance_prompt_template.format(instruction, user_input)
+ 
+ print("\n--- Generating Response ---")
+ try:
+     outputs = generator(
+         prompt,
+         max_new_tokens=250,                  # cap output length for memory efficiency
+         do_sample=True,
+         temperature=0.7,
+         top_p=0.9,
+         pad_token_id=tokenizer.eos_token_id,
+         num_beams=1,                         # no beam search, to save memory
+         use_cache=True,
+     )
+ 
+     # Keep only the text after the final "### Response:" marker
+     generated_text = outputs[0]["generated_text"]
+     response_start = generated_text.rfind("### Response:")
+     if response_start != -1:
+         response = generated_text[response_start + len("### Response:"):].strip()
+         print("\n--- Response ---")
+         print(response)
+     else:
+         print(generated_text)
+ 
+     # Clean up GPU memory after generation
+     if torch.cuda.is_available():
+         torch.cuda.empty_cache()
+ 
+ except Exception as e:
+     print(f"Generation error: {e}")
+ ````
309
+ ## Citation 📌
310
+ ````bibtex
+ @misc{finance-llama-8b,
+   author = {tarun7r},
+   title = {Finance-Llama-8B: A Llama 3.1 8B Model Fine-tuned on Josephgflowers/Finance-Instruct-500k},
+   year = {2025},
+   publisher = {Hugging Face},
+   journal = {Hugging Face Model Hub},
+   howpublished = {\url{https://huggingface.co/tarun7r/Finance-Llama-8B}}
+ }
+ ````
321
+
322
+ ## Disclaimer & Intended Uses
323
+
324
+ ### Model & License
325
+ This model is an experimental research implementation based on Meta's LLaMA 3.1 architecture and is governed by the LLaMA 3.1 community license terms, with additional restrictions as outlined below. It is designed for academic and research purposes to explore the influence of financial data in training language models. Users are advised that this model is experimental and should be used at their own risk, with full responsibility for any implementation or application. **This model is not a financial advisor and should not be used for financial decision-making.**
326
+
327
+ ### Liability & Responsibility
328
+ The creators of this model:
329
+ - Accept no responsibility for any use of the model, including any financial losses or damages incurred.
330
+ - Provide no warranties or guarantees regarding its performance, accuracy, or reliability.
331
+ - Make no claims about the suitability of the model for any specific purpose.
332
+
333
+ ### Intellectual Property & Attribution
334
+ - All findings and opinions expressed are solely those of the authors.
335
+ - This model is not endorsed by or affiliated with Meta, the CFA Institute, or any other institutions.
336
+ - All trademarks and intellectual property rights belong to their respective owners.
config.json ADDED
@@ -0,0 +1,40 @@
1
+ {
2
+ "architectures": [
3
+ "LlamaForCausalLM"
4
+ ],
5
+ "attention_bias": false,
6
+ "attention_dropout": 0.0,
7
+ "bos_token_id": 128000,
8
+ "eos_token_id": 128001,
9
+ "head_dim": 128,
10
+ "hidden_act": "silu",
11
+ "hidden_size": 4096,
12
+ "initializer_range": 0.02,
13
+ "intermediate_size": 14336,
14
+ "max_position_embeddings": 131072,
15
+ "mlp_bias": false,
16
+ "model_type": "llama",
17
+ "num_attention_heads": 32,
18
+ "num_hidden_layers": 32,
19
+ "num_key_value_heads": 8,
20
+ "pad_token_id": 128004,
21
+ "pretraining_tp": 1,
22
+ "rms_norm_eps": 1e-05,
23
+ "rope_scaling": {
24
+ "factor": 8.0,
25
+ "high_freq_factor": 4.0,
26
+ "low_freq_factor": 1.0,
27
+ "original_max_position_embeddings": 8192,
28
+ "rope_type": "llama3"
29
+ },
30
+ "rope_theta": 500000.0,
31
+ "tie_word_embeddings": false,
32
+ "torch_dtype": "float16",
33
+ "transformers_version": "4.51.3",
34
+ "unsloth_fixed": true,
35
+ "unsloth_version": "2025.5.7",
36
+ "use_cache": true,
37
+ "vocab_size": 128256,
38
+ "model_name": "OlgunLLM",
39
+ "description": "My renamed and fine-tuned finance model"
40
+ }
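
As a quick sanity check, the fields above can be read back with `transformers`' `AutoConfig`; a minimal sketch follows, assuming "shaddy43443/OlgunLLM" is this repo's Hub id (the nonstandard `model_name` and `description` keys are simply kept by `transformers` as extra config attributes):

```python
# Minimal sketch: inspect the uploaded config with AutoConfig.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("shaddy43443/OlgunLLM")  # assumed repo id
print(config.model_type)         # "llama"
print(config.num_hidden_layers)  # 32
print(config.rope_scaling)       # llama3 rope scaling, factor 8.0
```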
generation_config.json ADDED
@@ -0,0 +1,11 @@
1
+ {
2
+ "_from_model_config": true,
3
+ "bos_token_id": 128000,
4
+ "do_sample": true,
5
+ "eos_token_id": 128001,
6
+ "max_length": 131072,
7
+ "pad_token_id": 128004,
8
+ "temperature": 0.6,
9
+ "top_p": 0.9,
10
+ "transformers_version": "4.51.3"
11
+ }
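
These defaults (sampling enabled, temperature 0.6, top-p 0.9) are what `model.generate()` falls back to when no overrides are passed. A minimal sketch for loading them explicitly, under the same assumed repo id as above:

```python
# Minimal sketch: load the repo's default generation settings.
from transformers import GenerationConfig

gen_config = GenerationConfig.from_pretrained("shaddy43443/OlgunLLM")  # assumed repo id
print(gen_config.temperature, gen_config.top_p)  # 0.6 0.9
# Per-call overrides still win, e.g. model.generate(**inputs, temperature=0.8)
```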
model.safetensors.index.json ADDED
@@ -0,0 +1,298 @@
1
+ {
2
+ "metadata": {
3
+ "total_size": 16060522496
4
+ },
5
+ "weight_map": {
6
+ "model.embed_tokens.weight": "pytorch_model-00001-of-00004.safetensors",
7
+ "model.layers.0.input_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
8
+ "model.layers.0.mlp.down_proj.weight": "pytorch_model-00001-of-00004.safetensors",
9
+ "model.layers.0.mlp.gate_proj.weight": "pytorch_model-00001-of-00004.safetensors",
10
+ "model.layers.0.mlp.up_proj.weight": "pytorch_model-00001-of-00004.safetensors",
11
+ "model.layers.0.post_attention_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
12
+ "model.layers.0.self_attn.k_proj.weight": "pytorch_model-00001-of-00004.safetensors",
13
+ "model.layers.0.self_attn.o_proj.weight": "pytorch_model-00001-of-00004.safetensors",
14
+ "model.layers.0.self_attn.q_proj.weight": "pytorch_model-00001-of-00004.safetensors",
15
+ "model.layers.0.self_attn.v_proj.weight": "pytorch_model-00001-of-00004.safetensors",
16
+ "model.layers.1.input_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
17
+ "model.layers.1.mlp.down_proj.weight": "pytorch_model-00001-of-00004.safetensors",
18
+ "model.layers.1.mlp.gate_proj.weight": "pytorch_model-00001-of-00004.safetensors",
19
+ "model.layers.1.mlp.up_proj.weight": "pytorch_model-00001-of-00004.safetensors",
20
+ "model.layers.1.post_attention_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
21
+ "model.layers.1.self_attn.k_proj.weight": "pytorch_model-00001-of-00004.safetensors",
22
+ "model.layers.1.self_attn.o_proj.weight": "pytorch_model-00001-of-00004.safetensors",
23
+ "model.layers.1.self_attn.q_proj.weight": "pytorch_model-00001-of-00004.safetensors",
24
+ "model.layers.1.self_attn.v_proj.weight": "pytorch_model-00001-of-00004.safetensors",
25
+ "model.layers.2.input_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
26
+ "model.layers.2.mlp.down_proj.weight": "pytorch_model-00001-of-00004.safetensors",
27
+ "model.layers.2.mlp.gate_proj.weight": "pytorch_model-00001-of-00004.safetensors",
28
+ "model.layers.2.mlp.up_proj.weight": "pytorch_model-00001-of-00004.safetensors",
29
+ "model.layers.2.post_attention_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
30
+ "model.layers.2.self_attn.k_proj.weight": "pytorch_model-00001-of-00004.safetensors",
31
+ "model.layers.2.self_attn.o_proj.weight": "pytorch_model-00001-of-00004.safetensors",
32
+ "model.layers.2.self_attn.q_proj.weight": "pytorch_model-00001-of-00004.safetensors",
33
+ "model.layers.2.self_attn.v_proj.weight": "pytorch_model-00001-of-00004.safetensors",
34
+ "model.layers.3.input_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
35
+ "model.layers.3.mlp.down_proj.weight": "pytorch_model-00001-of-00004.safetensors",
36
+ "model.layers.3.mlp.gate_proj.weight": "pytorch_model-00001-of-00004.safetensors",
37
+ "model.layers.3.mlp.up_proj.weight": "pytorch_model-00001-of-00004.safetensors",
38
+ "model.layers.3.post_attention_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
39
+ "model.layers.3.self_attn.k_proj.weight": "pytorch_model-00001-of-00004.safetensors",
40
+ "model.layers.3.self_attn.o_proj.weight": "pytorch_model-00001-of-00004.safetensors",
41
+ "model.layers.3.self_attn.q_proj.weight": "pytorch_model-00001-of-00004.safetensors",
42
+ "model.layers.3.self_attn.v_proj.weight": "pytorch_model-00001-of-00004.safetensors",
43
+ "model.layers.4.input_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
44
+ "model.layers.4.mlp.down_proj.weight": "pytorch_model-00001-of-00004.safetensors",
45
+ "model.layers.4.mlp.gate_proj.weight": "pytorch_model-00001-of-00004.safetensors",
46
+ "model.layers.4.mlp.up_proj.weight": "pytorch_model-00001-of-00004.safetensors",
47
+ "model.layers.4.post_attention_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
48
+ "model.layers.4.self_attn.k_proj.weight": "pytorch_model-00001-of-00004.safetensors",
49
+ "model.layers.4.self_attn.o_proj.weight": "pytorch_model-00001-of-00004.safetensors",
50
+ "model.layers.4.self_attn.q_proj.weight": "pytorch_model-00001-of-00004.safetensors",
51
+ "model.layers.4.self_attn.v_proj.weight": "pytorch_model-00001-of-00004.safetensors",
52
+ "model.layers.5.input_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
53
+ "model.layers.5.mlp.down_proj.weight": "pytorch_model-00001-of-00004.safetensors",
54
+ "model.layers.5.mlp.gate_proj.weight": "pytorch_model-00001-of-00004.safetensors",
55
+ "model.layers.5.mlp.up_proj.weight": "pytorch_model-00001-of-00004.safetensors",
56
+ "model.layers.5.post_attention_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
57
+ "model.layers.5.self_attn.k_proj.weight": "pytorch_model-00001-of-00004.safetensors",
58
+ "model.layers.5.self_attn.o_proj.weight": "pytorch_model-00001-of-00004.safetensors",
59
+ "model.layers.5.self_attn.q_proj.weight": "pytorch_model-00001-of-00004.safetensors",
60
+ "model.layers.5.self_attn.v_proj.weight": "pytorch_model-00001-of-00004.safetensors",
61
+ "model.layers.6.input_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
62
+ "model.layers.6.mlp.down_proj.weight": "pytorch_model-00001-of-00004.safetensors",
63
+ "model.layers.6.mlp.gate_proj.weight": "pytorch_model-00001-of-00004.safetensors",
64
+ "model.layers.6.mlp.up_proj.weight": "pytorch_model-00001-of-00004.safetensors",
65
+ "model.layers.6.post_attention_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
66
+ "model.layers.6.self_attn.k_proj.weight": "pytorch_model-00001-of-00004.safetensors",
67
+ "model.layers.6.self_attn.o_proj.weight": "pytorch_model-00001-of-00004.safetensors",
68
+ "model.layers.6.self_attn.q_proj.weight": "pytorch_model-00001-of-00004.safetensors",
69
+ "model.layers.6.self_attn.v_proj.weight": "pytorch_model-00001-of-00004.safetensors",
70
+ "model.layers.7.input_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
71
+ "model.layers.7.mlp.down_proj.weight": "pytorch_model-00001-of-00004.safetensors",
72
+ "model.layers.7.mlp.gate_proj.weight": "pytorch_model-00001-of-00004.safetensors",
73
+ "model.layers.7.mlp.up_proj.weight": "pytorch_model-00001-of-00004.safetensors",
74
+ "model.layers.7.post_attention_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
75
+ "model.layers.7.self_attn.k_proj.weight": "pytorch_model-00001-of-00004.safetensors",
76
+ "model.layers.7.self_attn.o_proj.weight": "pytorch_model-00001-of-00004.safetensors",
77
+ "model.layers.7.self_attn.q_proj.weight": "pytorch_model-00001-of-00004.safetensors",
78
+ "model.layers.7.self_attn.v_proj.weight": "pytorch_model-00001-of-00004.safetensors",
79
+ "model.layers.8.input_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
80
+ "model.layers.8.mlp.down_proj.weight": "pytorch_model-00001-of-00004.safetensors",
81
+ "model.layers.8.mlp.gate_proj.weight": "pytorch_model-00001-of-00004.safetensors",
82
+ "model.layers.8.mlp.up_proj.weight": "pytorch_model-00001-of-00004.safetensors",
83
+ "model.layers.8.post_attention_layernorm.weight": "pytorch_model-00001-of-00004.safetensors",
84
+ "model.layers.8.self_attn.k_proj.weight": "pytorch_model-00001-of-00004.safetensors",
85
+ "model.layers.8.self_attn.o_proj.weight": "pytorch_model-00001-of-00004.safetensors",
86
+ "model.layers.8.self_attn.q_proj.weight": "pytorch_model-00001-of-00004.safetensors",
87
+ "model.layers.8.self_attn.v_proj.weight": "pytorch_model-00001-of-00004.safetensors",
88
+ "model.layers.10.input_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
89
+ "model.layers.10.mlp.down_proj.weight": "pytorch_model-00002-of-00004.safetensors",
90
+ "model.layers.10.mlp.gate_proj.weight": "pytorch_model-00002-of-00004.safetensors",
91
+ "model.layers.10.mlp.up_proj.weight": "pytorch_model-00002-of-00004.safetensors",
92
+ "model.layers.10.post_attention_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
93
+ "model.layers.10.self_attn.k_proj.weight": "pytorch_model-00002-of-00004.safetensors",
94
+ "model.layers.10.self_attn.o_proj.weight": "pytorch_model-00002-of-00004.safetensors",
95
+ "model.layers.10.self_attn.q_proj.weight": "pytorch_model-00002-of-00004.safetensors",
96
+ "model.layers.10.self_attn.v_proj.weight": "pytorch_model-00002-of-00004.safetensors",
97
+ "model.layers.11.input_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
98
+ "model.layers.11.mlp.down_proj.weight": "pytorch_model-00002-of-00004.safetensors",
99
+ "model.layers.11.mlp.gate_proj.weight": "pytorch_model-00002-of-00004.safetensors",
100
+ "model.layers.11.mlp.up_proj.weight": "pytorch_model-00002-of-00004.safetensors",
101
+ "model.layers.11.post_attention_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
102
+ "model.layers.11.self_attn.k_proj.weight": "pytorch_model-00002-of-00004.safetensors",
103
+ "model.layers.11.self_attn.o_proj.weight": "pytorch_model-00002-of-00004.safetensors",
104
+ "model.layers.11.self_attn.q_proj.weight": "pytorch_model-00002-of-00004.safetensors",
105
+ "model.layers.11.self_attn.v_proj.weight": "pytorch_model-00002-of-00004.safetensors",
106
+ "model.layers.12.input_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
107
+ "model.layers.12.mlp.down_proj.weight": "pytorch_model-00002-of-00004.safetensors",
108
+ "model.layers.12.mlp.gate_proj.weight": "pytorch_model-00002-of-00004.safetensors",
109
+ "model.layers.12.mlp.up_proj.weight": "pytorch_model-00002-of-00004.safetensors",
110
+ "model.layers.12.post_attention_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
111
+ "model.layers.12.self_attn.k_proj.weight": "pytorch_model-00002-of-00004.safetensors",
112
+ "model.layers.12.self_attn.o_proj.weight": "pytorch_model-00002-of-00004.safetensors",
113
+ "model.layers.12.self_attn.q_proj.weight": "pytorch_model-00002-of-00004.safetensors",
114
+ "model.layers.12.self_attn.v_proj.weight": "pytorch_model-00002-of-00004.safetensors",
115
+ "model.layers.13.input_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
116
+ "model.layers.13.mlp.down_proj.weight": "pytorch_model-00002-of-00004.safetensors",
117
+ "model.layers.13.mlp.gate_proj.weight": "pytorch_model-00002-of-00004.safetensors",
118
+ "model.layers.13.mlp.up_proj.weight": "pytorch_model-00002-of-00004.safetensors",
119
+ "model.layers.13.post_attention_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
120
+ "model.layers.13.self_attn.k_proj.weight": "pytorch_model-00002-of-00004.safetensors",
121
+ "model.layers.13.self_attn.o_proj.weight": "pytorch_model-00002-of-00004.safetensors",
122
+ "model.layers.13.self_attn.q_proj.weight": "pytorch_model-00002-of-00004.safetensors",
123
+ "model.layers.13.self_attn.v_proj.weight": "pytorch_model-00002-of-00004.safetensors",
124
+ "model.layers.14.input_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
125
+ "model.layers.14.mlp.down_proj.weight": "pytorch_model-00002-of-00004.safetensors",
126
+ "model.layers.14.mlp.gate_proj.weight": "pytorch_model-00002-of-00004.safetensors",
127
+ "model.layers.14.mlp.up_proj.weight": "pytorch_model-00002-of-00004.safetensors",
128
+ "model.layers.14.post_attention_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
129
+ "model.layers.14.self_attn.k_proj.weight": "pytorch_model-00002-of-00004.safetensors",
130
+ "model.layers.14.self_attn.o_proj.weight": "pytorch_model-00002-of-00004.safetensors",
131
+ "model.layers.14.self_attn.q_proj.weight": "pytorch_model-00002-of-00004.safetensors",
132
+ "model.layers.14.self_attn.v_proj.weight": "pytorch_model-00002-of-00004.safetensors",
133
+ "model.layers.15.input_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
134
+ "model.layers.15.mlp.down_proj.weight": "pytorch_model-00002-of-00004.safetensors",
135
+ "model.layers.15.mlp.gate_proj.weight": "pytorch_model-00002-of-00004.safetensors",
136
+ "model.layers.15.mlp.up_proj.weight": "pytorch_model-00002-of-00004.safetensors",
137
+ "model.layers.15.post_attention_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
138
+ "model.layers.15.self_attn.k_proj.weight": "pytorch_model-00002-of-00004.safetensors",
139
+ "model.layers.15.self_attn.o_proj.weight": "pytorch_model-00002-of-00004.safetensors",
140
+ "model.layers.15.self_attn.q_proj.weight": "pytorch_model-00002-of-00004.safetensors",
141
+ "model.layers.15.self_attn.v_proj.weight": "pytorch_model-00002-of-00004.safetensors",
142
+ "model.layers.16.input_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
143
+ "model.layers.16.mlp.down_proj.weight": "pytorch_model-00002-of-00004.safetensors",
144
+ "model.layers.16.mlp.gate_proj.weight": "pytorch_model-00002-of-00004.safetensors",
145
+ "model.layers.16.mlp.up_proj.weight": "pytorch_model-00002-of-00004.safetensors",
146
+ "model.layers.16.post_attention_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
147
+ "model.layers.16.self_attn.k_proj.weight": "pytorch_model-00002-of-00004.safetensors",
148
+ "model.layers.16.self_attn.o_proj.weight": "pytorch_model-00002-of-00004.safetensors",
149
+ "model.layers.16.self_attn.q_proj.weight": "pytorch_model-00002-of-00004.safetensors",
150
+ "model.layers.16.self_attn.v_proj.weight": "pytorch_model-00002-of-00004.safetensors",
151
+ "model.layers.17.input_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
152
+ "model.layers.17.mlp.down_proj.weight": "pytorch_model-00002-of-00004.safetensors",
153
+ "model.layers.17.mlp.gate_proj.weight": "pytorch_model-00002-of-00004.safetensors",
154
+ "model.layers.17.mlp.up_proj.weight": "pytorch_model-00002-of-00004.safetensors",
155
+ "model.layers.17.post_attention_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
156
+ "model.layers.17.self_attn.k_proj.weight": "pytorch_model-00002-of-00004.safetensors",
157
+ "model.layers.17.self_attn.o_proj.weight": "pytorch_model-00002-of-00004.safetensors",
158
+ "model.layers.17.self_attn.q_proj.weight": "pytorch_model-00002-of-00004.safetensors",
159
+ "model.layers.17.self_attn.v_proj.weight": "pytorch_model-00002-of-00004.safetensors",
160
+ "model.layers.18.input_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
161
+ "model.layers.18.mlp.down_proj.weight": "pytorch_model-00002-of-00004.safetensors",
162
+ "model.layers.18.mlp.gate_proj.weight": "pytorch_model-00002-of-00004.safetensors",
163
+ "model.layers.18.mlp.up_proj.weight": "pytorch_model-00002-of-00004.safetensors",
164
+ "model.layers.18.post_attention_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
165
+ "model.layers.18.self_attn.k_proj.weight": "pytorch_model-00002-of-00004.safetensors",
166
+ "model.layers.18.self_attn.o_proj.weight": "pytorch_model-00002-of-00004.safetensors",
167
+ "model.layers.18.self_attn.q_proj.weight": "pytorch_model-00002-of-00004.safetensors",
168
+ "model.layers.18.self_attn.v_proj.weight": "pytorch_model-00002-of-00004.safetensors",
169
+ "model.layers.19.input_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
170
+ "model.layers.19.mlp.down_proj.weight": "pytorch_model-00002-of-00004.safetensors",
171
+ "model.layers.19.mlp.gate_proj.weight": "pytorch_model-00002-of-00004.safetensors",
172
+ "model.layers.19.mlp.up_proj.weight": "pytorch_model-00002-of-00004.safetensors",
173
+ "model.layers.19.post_attention_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
174
+ "model.layers.19.self_attn.k_proj.weight": "pytorch_model-00002-of-00004.safetensors",
175
+ "model.layers.19.self_attn.o_proj.weight": "pytorch_model-00002-of-00004.safetensors",
176
+ "model.layers.19.self_attn.q_proj.weight": "pytorch_model-00002-of-00004.safetensors",
177
+ "model.layers.19.self_attn.v_proj.weight": "pytorch_model-00002-of-00004.safetensors",
178
+ "model.layers.20.mlp.gate_proj.weight": "pytorch_model-00002-of-00004.safetensors",
179
+ "model.layers.20.self_attn.k_proj.weight": "pytorch_model-00002-of-00004.safetensors",
180
+ "model.layers.20.self_attn.o_proj.weight": "pytorch_model-00002-of-00004.safetensors",
181
+ "model.layers.20.self_attn.q_proj.weight": "pytorch_model-00002-of-00004.safetensors",
182
+ "model.layers.20.self_attn.v_proj.weight": "pytorch_model-00002-of-00004.safetensors",
183
+ "model.layers.9.input_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
184
+ "model.layers.9.mlp.down_proj.weight": "pytorch_model-00002-of-00004.safetensors",
185
+ "model.layers.9.mlp.gate_proj.weight": "pytorch_model-00002-of-00004.safetensors",
186
+ "model.layers.9.mlp.up_proj.weight": "pytorch_model-00002-of-00004.safetensors",
187
+ "model.layers.9.post_attention_layernorm.weight": "pytorch_model-00002-of-00004.safetensors",
188
+ "model.layers.9.self_attn.k_proj.weight": "pytorch_model-00002-of-00004.safetensors",
189
+ "model.layers.9.self_attn.o_proj.weight": "pytorch_model-00002-of-00004.safetensors",
190
+ "model.layers.9.self_attn.q_proj.weight": "pytorch_model-00002-of-00004.safetensors",
191
+ "model.layers.9.self_attn.v_proj.weight": "pytorch_model-00002-of-00004.safetensors",
192
+ "model.layers.20.input_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
193
+ "model.layers.20.mlp.down_proj.weight": "pytorch_model-00003-of-00004.safetensors",
194
+ "model.layers.20.mlp.up_proj.weight": "pytorch_model-00003-of-00004.safetensors",
195
+ "model.layers.20.post_attention_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
196
+ "model.layers.21.input_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
197
+ "model.layers.21.mlp.down_proj.weight": "pytorch_model-00003-of-00004.safetensors",
198
+ "model.layers.21.mlp.gate_proj.weight": "pytorch_model-00003-of-00004.safetensors",
199
+ "model.layers.21.mlp.up_proj.weight": "pytorch_model-00003-of-00004.safetensors",
200
+ "model.layers.21.post_attention_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
201
+ "model.layers.21.self_attn.k_proj.weight": "pytorch_model-00003-of-00004.safetensors",
202
+ "model.layers.21.self_attn.o_proj.weight": "pytorch_model-00003-of-00004.safetensors",
203
+ "model.layers.21.self_attn.q_proj.weight": "pytorch_model-00003-of-00004.safetensors",
204
+ "model.layers.21.self_attn.v_proj.weight": "pytorch_model-00003-of-00004.safetensors",
205
+ "model.layers.22.input_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
206
+ "model.layers.22.mlp.down_proj.weight": "pytorch_model-00003-of-00004.safetensors",
207
+ "model.layers.22.mlp.gate_proj.weight": "pytorch_model-00003-of-00004.safetensors",
208
+ "model.layers.22.mlp.up_proj.weight": "pytorch_model-00003-of-00004.safetensors",
209
+ "model.layers.22.post_attention_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
210
+ "model.layers.22.self_attn.k_proj.weight": "pytorch_model-00003-of-00004.safetensors",
211
+ "model.layers.22.self_attn.o_proj.weight": "pytorch_model-00003-of-00004.safetensors",
212
+ "model.layers.22.self_attn.q_proj.weight": "pytorch_model-00003-of-00004.safetensors",
213
+ "model.layers.22.self_attn.v_proj.weight": "pytorch_model-00003-of-00004.safetensors",
214
+ "model.layers.23.input_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
215
+ "model.layers.23.mlp.down_proj.weight": "pytorch_model-00003-of-00004.safetensors",
216
+ "model.layers.23.mlp.gate_proj.weight": "pytorch_model-00003-of-00004.safetensors",
217
+ "model.layers.23.mlp.up_proj.weight": "pytorch_model-00003-of-00004.safetensors",
218
+ "model.layers.23.post_attention_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
219
+ "model.layers.23.self_attn.k_proj.weight": "pytorch_model-00003-of-00004.safetensors",
220
+ "model.layers.23.self_attn.o_proj.weight": "pytorch_model-00003-of-00004.safetensors",
221
+ "model.layers.23.self_attn.q_proj.weight": "pytorch_model-00003-of-00004.safetensors",
222
+ "model.layers.23.self_attn.v_proj.weight": "pytorch_model-00003-of-00004.safetensors",
223
+ "model.layers.24.input_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
224
+ "model.layers.24.mlp.down_proj.weight": "pytorch_model-00003-of-00004.safetensors",
225
+ "model.layers.24.mlp.gate_proj.weight": "pytorch_model-00003-of-00004.safetensors",
226
+ "model.layers.24.mlp.up_proj.weight": "pytorch_model-00003-of-00004.safetensors",
227
+ "model.layers.24.post_attention_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
228
+ "model.layers.24.self_attn.k_proj.weight": "pytorch_model-00003-of-00004.safetensors",
229
+ "model.layers.24.self_attn.o_proj.weight": "pytorch_model-00003-of-00004.safetensors",
230
+ "model.layers.24.self_attn.q_proj.weight": "pytorch_model-00003-of-00004.safetensors",
231
+ "model.layers.24.self_attn.v_proj.weight": "pytorch_model-00003-of-00004.safetensors",
232
+ "model.layers.25.input_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
233
+ "model.layers.25.mlp.down_proj.weight": "pytorch_model-00003-of-00004.safetensors",
234
+ "model.layers.25.mlp.gate_proj.weight": "pytorch_model-00003-of-00004.safetensors",
235
+ "model.layers.25.mlp.up_proj.weight": "pytorch_model-00003-of-00004.safetensors",
236
+ "model.layers.25.post_attention_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
237
+ "model.layers.25.self_attn.k_proj.weight": "pytorch_model-00003-of-00004.safetensors",
238
+ "model.layers.25.self_attn.o_proj.weight": "pytorch_model-00003-of-00004.safetensors",
239
+ "model.layers.25.self_attn.q_proj.weight": "pytorch_model-00003-of-00004.safetensors",
240
+ "model.layers.25.self_attn.v_proj.weight": "pytorch_model-00003-of-00004.safetensors",
241
+ "model.layers.26.input_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
242
+ "model.layers.26.mlp.down_proj.weight": "pytorch_model-00003-of-00004.safetensors",
243
+ "model.layers.26.mlp.gate_proj.weight": "pytorch_model-00003-of-00004.safetensors",
244
+ "model.layers.26.mlp.up_proj.weight": "pytorch_model-00003-of-00004.safetensors",
245
+ "model.layers.26.post_attention_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
246
+ "model.layers.26.self_attn.k_proj.weight": "pytorch_model-00003-of-00004.safetensors",
247
+ "model.layers.26.self_attn.o_proj.weight": "pytorch_model-00003-of-00004.safetensors",
248
+ "model.layers.26.self_attn.q_proj.weight": "pytorch_model-00003-of-00004.safetensors",
249
+ "model.layers.26.self_attn.v_proj.weight": "pytorch_model-00003-of-00004.safetensors",
250
+ "model.layers.27.input_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
251
+ "model.layers.27.mlp.down_proj.weight": "pytorch_model-00003-of-00004.safetensors",
252
+ "model.layers.27.mlp.gate_proj.weight": "pytorch_model-00003-of-00004.safetensors",
253
+ "model.layers.27.mlp.up_proj.weight": "pytorch_model-00003-of-00004.safetensors",
254
+ "model.layers.27.post_attention_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
255
+ "model.layers.27.self_attn.k_proj.weight": "pytorch_model-00003-of-00004.safetensors",
256
+ "model.layers.27.self_attn.o_proj.weight": "pytorch_model-00003-of-00004.safetensors",
257
+ "model.layers.27.self_attn.q_proj.weight": "pytorch_model-00003-of-00004.safetensors",
258
+ "model.layers.27.self_attn.v_proj.weight": "pytorch_model-00003-of-00004.safetensors",
259
+ "model.layers.28.input_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
260
+ "model.layers.28.mlp.down_proj.weight": "pytorch_model-00003-of-00004.safetensors",
261
+ "model.layers.28.mlp.gate_proj.weight": "pytorch_model-00003-of-00004.safetensors",
262
+ "model.layers.28.mlp.up_proj.weight": "pytorch_model-00003-of-00004.safetensors",
263
+ "model.layers.28.post_attention_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
264
+ "model.layers.28.self_attn.k_proj.weight": "pytorch_model-00003-of-00004.safetensors",
265
+ "model.layers.28.self_attn.o_proj.weight": "pytorch_model-00003-of-00004.safetensors",
266
+ "model.layers.28.self_attn.q_proj.weight": "pytorch_model-00003-of-00004.safetensors",
267
+ "model.layers.28.self_attn.v_proj.weight": "pytorch_model-00003-of-00004.safetensors",
268
+ "model.layers.29.input_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
269
+ "model.layers.29.mlp.down_proj.weight": "pytorch_model-00003-of-00004.safetensors",
270
+ "model.layers.29.mlp.gate_proj.weight": "pytorch_model-00003-of-00004.safetensors",
271
+ "model.layers.29.mlp.up_proj.weight": "pytorch_model-00003-of-00004.safetensors",
272
+ "model.layers.29.post_attention_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
273
+ "model.layers.29.self_attn.k_proj.weight": "pytorch_model-00003-of-00004.safetensors",
274
+ "model.layers.29.self_attn.o_proj.weight": "pytorch_model-00003-of-00004.safetensors",
275
+ "model.layers.29.self_attn.q_proj.weight": "pytorch_model-00003-of-00004.safetensors",
276
+ "model.layers.29.self_attn.v_proj.weight": "pytorch_model-00003-of-00004.safetensors",
277
+ "model.layers.30.input_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
278
+ "model.layers.30.mlp.down_proj.weight": "pytorch_model-00003-of-00004.safetensors",
279
+ "model.layers.30.mlp.gate_proj.weight": "pytorch_model-00003-of-00004.safetensors",
280
+ "model.layers.30.mlp.up_proj.weight": "pytorch_model-00003-of-00004.safetensors",
281
+ "model.layers.30.post_attention_layernorm.weight": "pytorch_model-00003-of-00004.safetensors",
282
+ "model.layers.30.self_attn.k_proj.weight": "pytorch_model-00003-of-00004.safetensors",
283
+ "model.layers.30.self_attn.o_proj.weight": "pytorch_model-00003-of-00004.safetensors",
284
+ "model.layers.30.self_attn.q_proj.weight": "pytorch_model-00003-of-00004.safetensors",
285
+ "model.layers.30.self_attn.v_proj.weight": "pytorch_model-00003-of-00004.safetensors",
286
+ "model.layers.31.mlp.gate_proj.weight": "pytorch_model-00003-of-00004.safetensors",
287
+ "model.layers.31.mlp.up_proj.weight": "pytorch_model-00003-of-00004.safetensors",
288
+ "model.layers.31.self_attn.k_proj.weight": "pytorch_model-00003-of-00004.safetensors",
289
+ "model.layers.31.self_attn.o_proj.weight": "pytorch_model-00003-of-00004.safetensors",
290
+ "model.layers.31.self_attn.q_proj.weight": "pytorch_model-00003-of-00004.safetensors",
291
+ "model.layers.31.self_attn.v_proj.weight": "pytorch_model-00003-of-00004.safetensors",
292
+ "lm_head.weight": "pytorch_model-00004-of-00004.safetensors",
293
+ "model.layers.31.input_layernorm.weight": "pytorch_model-00004-of-00004.safetensors",
294
+ "model.layers.31.mlp.down_proj.weight": "pytorch_model-00004-of-00004.safetensors",
295
+ "model.layers.31.post_attention_layernorm.weight": "pytorch_model-00004-of-00004.safetensors",
296
+ "model.norm.weight": "pytorch_model-00004-of-00004.safetensors"
297
+ }
298
+ }
pytorch_model-00001-of-00004.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:dd958bceca742b63988e37e4a5d3d1a021ea45836ba893ae1ca1108942d31422
3
+ size 4976698592
pytorch_model-00002-of-00004.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:45f4aeb19a1b9bc93161a0c887e5793c7528d29d5744fd6e7a3bdbdf8a6b1692
3
+ size 4999802616
pytorch_model-00003-of-00004.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:89cb5929bf924cd3936409a97a0e1071eace8b4976a7355cfe821068eea8d966
3
+ size 4915916080
pytorch_model-00004-of-00004.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:b665ba94a043aaa20672dd38661974ae23af59db44e4c306fd0e0fce8d44557c
3
+ size 1168138808
safe.py ADDED
@@ -0,0 +1,44 @@
+ import json
+ import os
+ 
+ from safetensors import safe_open
+ 
+ def generate_safetensors_index(model_path="."):
+     """Generate model.safetensors.index.json from existing safetensors files."""
+ 
+     # Load the existing bin index as a reference for the metadata block
+     with open(os.path.join(model_path, "pytorch_model.bin.index.json"), "r") as f:
+         bin_index = json.load(f)
+ 
+     # Initialize the safetensors index structure
+     safetensors_index = {
+         "metadata": bin_index.get("metadata", {}),
+         "weight_map": {}
+     }
+ 
+     # Map each tensor name to the shard file that contains it
+     safetensors_files = [
+         "pytorch_model-00001-of-00004.safetensors",
+         "pytorch_model-00002-of-00004.safetensors",
+         "pytorch_model-00003-of-00004.safetensors",
+         "pytorch_model-00004-of-00004.safetensors",
+     ]
+ 
+     for safetensor_file in safetensors_files:
+         try:
+             with safe_open(os.path.join(model_path, safetensor_file), framework="pt") as f:
+                 for tensor_name in f.keys():
+                     safetensors_index["weight_map"][tensor_name] = safetensor_file
+             print(f"✓ Processed {safetensor_file}")
+         except Exception as e:
+             print(f"✗ Error processing {safetensor_file}: {e}")
+ 
+     # Save the index file next to the shards
+     with open(os.path.join(model_path, "model.safetensors.index.json"), "w") as f:
+         json.dump(safetensors_index, f, indent=2)
+ 
+     print(f"✓ Generated model.safetensors.index.json with {len(safetensors_index['weight_map'])} tensors")
+     return safetensors_index
+ 
+ if __name__ == "__main__":
+     # Change this path to your model directory if needed
+     generate_safetensors_index("./Finance-Llama-8B")
special_tokens_map.json ADDED
@@ -0,0 +1,23 @@
1
+ {
2
+ "bos_token": {
3
+ "content": "<|begin_of_text|>",
4
+ "lstrip": false,
5
+ "normalized": false,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "eos_token": {
10
+ "content": "<|end_of_text|>",
11
+ "lstrip": false,
12
+ "normalized": false,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "pad_token": {
17
+ "content": "<|finetune_right_pad_id|>",
18
+ "lstrip": false,
19
+ "normalized": false,
20
+ "rstrip": false,
21
+ "single_word": false
22
+ }
23
+ }
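
A quick way to confirm these tokens resolve through the tokenizer as declared (a minimal sketch, same assumed repo id as above):

```python
# Minimal sketch: verify the special tokens declared above.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("shaddy43443/OlgunLLM")  # assumed repo id
print(tok.bos_token)  # <|begin_of_text|>
print(tok.eos_token)  # <|end_of_text|>
print(tok.pad_token)  # <|finetune_right_pad_id|>
```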
tokenizer.json ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:6b9e4e7fb171f92fd137b777cc2714bf87d11576700a1dcd7a399e7bbe39537b
3
+ size 17209920
tokenizer_config.json ADDED
@@ -0,0 +1,2066 @@
1
+ {
2
+ "add_bos_token": true,
3
+ "added_tokens_decoder": {
4
+ "128000": {
5
+ "content": "<|begin_of_text|>",
6
+ "lstrip": false,
7
+ "normalized": false,
8
+ "rstrip": false,
9
+ "single_word": false,
10
+ "special": true
11
+ },
12
+ "128001": {
13
+ "content": "<|end_of_text|>",
14
+ "lstrip": false,
15
+ "normalized": false,
16
+ "rstrip": false,
17
+ "single_word": false,
18
+ "special": true
19
+ },
20
+ "128002": {
21
+ "content": "<|reserved_special_token_0|>",
22
+ "lstrip": false,
23
+ "normalized": false,
24
+ "rstrip": false,
25
+ "single_word": false,
26
+ "special": true
27
+ },
28
+ "128003": {
29
+ "content": "<|reserved_special_token_1|>",
30
+ "lstrip": false,
31
+ "normalized": false,
32
+ "rstrip": false,
33
+ "single_word": false,
34
+ "special": true
35
+ },
36
+ "128004": {
37
+ "content": "<|finetune_right_pad_id|>",
38
+ "lstrip": false,
39
+ "normalized": false,
40
+ "rstrip": false,
41
+ "single_word": false,
42
+ "special": true
43
+ },
44
+ "128005": {
45
+ "content": "<|reserved_special_token_2|>",
46
+ "lstrip": false,
47
+ "normalized": false,
48
+ "rstrip": false,
49
+ "single_word": false,
50
+ "special": true
51
+ },
52
+ "128006": {
53
+ "content": "<|start_header_id|>",
54
+ "lstrip": false,
55
+ "normalized": false,
56
+ "rstrip": false,
57
+ "single_word": false,
58
+ "special": true
59
+ },
60
+ "128007": {
61
+ "content": "<|end_header_id|>",
62
+ "lstrip": false,
63
+ "normalized": false,
64
+ "rstrip": false,
65
+ "single_word": false,
66
+ "special": true
67
+ },
68
+ "128008": {
69
+ "content": "<|eom_id|>",
70
+ "lstrip": false,
71
+ "normalized": false,
72
+ "rstrip": false,
73
+ "single_word": false,
74
+ "special": true
75
+ },
76
+ "128009": {
77
+ "content": "<|eot_id|>",
78
+ "lstrip": false,
79
+ "normalized": false,
80
+ "rstrip": false,
81
+ "single_word": false,
82
+ "special": true
83
+ },
84
+ "128010": {
85
+ "content": "<|python_tag|>",
86
+ "lstrip": false,
87
+ "normalized": false,
88
+ "rstrip": false,
89
+ "single_word": false,
90
+ "special": true
91
+ },
92
+ "128011": {
93
+ "content": "<|reserved_special_token_3|>",
94
+ "lstrip": false,
95
+ "normalized": false,
96
+ "rstrip": false,
97
+ "single_word": false,
98
+ "special": true
99
+ },
100
+ "128012": {
101
+ "content": "<|reserved_special_token_4|>",
102
+ "lstrip": false,
103
+ "normalized": false,
104
+ "rstrip": false,
105
+ "single_word": false,
106
+ "special": true
107
+ },
108
+ "128013": {
109
+ "content": "<|reserved_special_token_5|>",
110
+ "lstrip": false,
111
+ "normalized": false,
112
+ "rstrip": false,
113
+ "single_word": false,
114
+ "special": true
115
+ },
116
+ "128014": {
117
+ "content": "<|reserved_special_token_6|>",
118
+ "lstrip": false,
119
+ "normalized": false,
120
+ "rstrip": false,
121
+ "single_word": false,
122
+ "special": true
123
+ },
124
+ "128015": {
125
+ "content": "<|reserved_special_token_7|>",
126
+ "lstrip": false,
127
+ "normalized": false,
128
+ "rstrip": false,
129
+ "single_word": false,
130
+ "special": true
131
+ },
132
+ "128016": {
133
+ "content": "<|reserved_special_token_8|>",
134
+ "lstrip": false,
135
+ "normalized": false,
136
+ "rstrip": false,
137
+ "single_word": false,
138
+ "special": true
139
+ },
140
+ "128017": {
141
+ "content": "<|reserved_special_token_9|>",
142
+ "lstrip": false,
143
+ "normalized": false,
144
+ "rstrip": false,
145
+ "single_word": false,
146
+ "special": true
147
+ },
148
+ "128018": {
149
+ "content": "<|reserved_special_token_10|>",
150
+ "lstrip": false,
151
+ "normalized": false,
152
+ "rstrip": false,
153
+ "single_word": false,
154
+ "special": true
155
+ },
156
+ "128019": {
157
+ "content": "<|reserved_special_token_11|>",
158
+ "lstrip": false,
159
+ "normalized": false,
160
+ "rstrip": false,
161
+ "single_word": false,
162
+ "special": true
163
+ },
164
+ "128020": {
165
+ "content": "<|reserved_special_token_12|>",
166
+ "lstrip": false,
167
+ "normalized": false,
168
+ "rstrip": false,
169
+ "single_word": false,
170
+ "special": true
171
+ },
172
+ "128021": {
173
+ "content": "<|reserved_special_token_13|>",
174
+ "lstrip": false,
175
+ "normalized": false,
176
+ "rstrip": false,
177
+ "single_word": false,
178
+ "special": true
179
+ },
180
+ "128022": {
181
+ "content": "<|reserved_special_token_14|>",
182
+ "lstrip": false,
183
+ "normalized": false,
184
+ "rstrip": false,
185
+ "single_word": false,
186
+ "special": true
187
+ },
188
+ "128023": {
189
+ "content": "<|reserved_special_token_15|>",
190
+ "lstrip": false,
191
+ "normalized": false,
192
+ "rstrip": false,
193
+ "single_word": false,
194
+ "special": true
195
+ },
196
+ "128024": {
197
+ "content": "<|reserved_special_token_16|>",
198
+ "lstrip": false,
199
+ "normalized": false,
200
+ "rstrip": false,
201
+ "single_word": false,
202
+ "special": true
203
+ },
204
+ "128025": {
205
+ "content": "<|reserved_special_token_17|>",
206
+ "lstrip": false,
207
+ "normalized": false,
208
+ "rstrip": false,
209
+ "single_word": false,
210
+ "special": true
211
+ },
212
+ "128026": {
213
+ "content": "<|reserved_special_token_18|>",
214
+ "lstrip": false,
215
+ "normalized": false,
216
+ "rstrip": false,
217
+ "single_word": false,
218
+ "special": true
219
+ },
220
+ "128027": {
221
+ "content": "<|reserved_special_token_19|>",
222
+ "lstrip": false,
223
+ "normalized": false,
224
+ "rstrip": false,
225
+ "single_word": false,
226
+ "special": true
227
+ },
228
+ "128028": {
229
+ "content": "<|reserved_special_token_20|>",
230
+ "lstrip": false,
231
+ "normalized": false,
232
+ "rstrip": false,
233
+ "single_word": false,
234
+ "special": true
235
+ },
236
+ "128029": {
237
+ "content": "<|reserved_special_token_21|>",
238
+ "lstrip": false,
239
+ "normalized": false,
240
+ "rstrip": false,
241
+ "single_word": false,
242
+ "special": true
243
+ },
244
+ "128030": {
245
+ "content": "<|reserved_special_token_22|>",
246
+ "lstrip": false,
247
+ "normalized": false,
248
+ "rstrip": false,
249
+ "single_word": false,
250
+ "special": true
251
+ },
252
+ "128031": {
253
+ "content": "<|reserved_special_token_23|>",
254
+ "lstrip": false,
255
+ "normalized": false,
256
+ "rstrip": false,
257
+ "single_word": false,
258
+ "special": true
259
+ },
260
+ "128032": {
261
+ "content": "<|reserved_special_token_24|>",
262
+ "lstrip": false,
263
+ "normalized": false,
264
+ "rstrip": false,
265
+ "single_word": false,
266
+ "special": true
267
+ },
268
+ "128033": {
269
+ "content": "<|reserved_special_token_25|>",
270
+ "lstrip": false,
271
+ "normalized": false,
272
+ "rstrip": false,
273
+ "single_word": false,
274
+ "special": true
275
+ },
276
+ "128034": {
277
+ "content": "<|reserved_special_token_26|>",
278
+ "lstrip": false,
279
+ "normalized": false,
280
+ "rstrip": false,
281
+ "single_word": false,
282
+ "special": true
283
+ },
284
+ "128035": {
285
+ "content": "<|reserved_special_token_27|>",
286
+ "lstrip": false,
287
+ "normalized": false,
288
+ "rstrip": false,
289
+ "single_word": false,
290
+ "special": true
291
+ },
292
+ "128036": {
293
+ "content": "<|reserved_special_token_28|>",
294
+ "lstrip": false,
295
+ "normalized": false,
296
+ "rstrip": false,
297
+ "single_word": false,
298
+ "special": true
299
+ },
300
+ "128037": {
301
+ "content": "<|reserved_special_token_29|>",
302
+ "lstrip": false,
303
+ "normalized": false,
304
+ "rstrip": false,
305
+ "single_word": false,
306
+ "special": true
307
+ },
308
+ "128038": {
309
+ "content": "<|reserved_special_token_30|>",
310
+ "lstrip": false,
311
+ "normalized": false,
312
+ "rstrip": false,
313
+ "single_word": false,
314
+ "special": true
315
+ },
316
+ "128039": {
317
+ "content": "<|reserved_special_token_31|>",
318
+ "lstrip": false,
319
+ "normalized": false,
320
+ "rstrip": false,
321
+ "single_word": false,
322
+ "special": true
323
+ },
324
+ "128040": {
325
+ "content": "<|reserved_special_token_32|>",
326
+ "lstrip": false,
327
+ "normalized": false,
328
+ "rstrip": false,
329
+ "single_word": false,
330
+ "special": true
331
+ },
332
+ "128041": {
333
+ "content": "<|reserved_special_token_33|>",
334
+ "lstrip": false,
335
+ "normalized": false,
336
+ "rstrip": false,
337
+ "single_word": false,
338
+ "special": true
339
+ },
340
+ "128042": {
341
+ "content": "<|reserved_special_token_34|>",
342
+ "lstrip": false,
343
+ "normalized": false,
344
+ "rstrip": false,
345
+ "single_word": false,
346
+ "special": true
347
+ },
348
+ "128043": {
349
+ "content": "<|reserved_special_token_35|>",
350
+ "lstrip": false,
351
+ "normalized": false,
352
+ "rstrip": false,
353
+ "single_word": false,
354
+ "special": true
355
+ },
356
+ "128044": {
357
+ "content": "<|reserved_special_token_36|>",
358
+ "lstrip": false,
359
+ "normalized": false,
360
+ "rstrip": false,
361
+ "single_word": false,
362
+ "special": true
363
+ },
364
+ "128045": {
365
+ "content": "<|reserved_special_token_37|>",
366
+ "lstrip": false,
367
+ "normalized": false,
368
+ "rstrip": false,
369
+ "single_word": false,
370
+ "special": true
371
+ },
372
+ "128046": {
373
+ "content": "<|reserved_special_token_38|>",
374
+ "lstrip": false,
375
+ "normalized": false,
376
+ "rstrip": false,
377
+ "single_word": false,
378
+ "special": true
379
+ },
380
+ "128047": {
381
+ "content": "<|reserved_special_token_39|>",
382
+ "lstrip": false,
383
+ "normalized": false,
384
+ "rstrip": false,
385
+ "single_word": false,
386
+ "special": true
387
+ },
388
+ "128048": {
389
+ "content": "<|reserved_special_token_40|>",
390
+ "lstrip": false,
391
+ "normalized": false,
392
+ "rstrip": false,
393
+ "single_word": false,
394
+ "special": true
395
+ },
396
+ "128049": {
397
+ "content": "<|reserved_special_token_41|>",
398
+ "lstrip": false,
399
+ "normalized": false,
400
+ "rstrip": false,
401
+ "single_word": false,
402
+ "special": true
403
+ },
404
+ "128050": {
405
+ "content": "<|reserved_special_token_42|>",
406
+ "lstrip": false,
407
+ "normalized": false,
408
+ "rstrip": false,
409
+ "single_word": false,
410
+ "special": true
411
+ },
412
+ "128051": {
413
+ "content": "<|reserved_special_token_43|>",
414
+ "lstrip": false,
415
+ "normalized": false,
416
+ "rstrip": false,
417
+ "single_word": false,
418
+ "special": true
419
+ },
420
+ "128052": {
421
+ "content": "<|reserved_special_token_44|>",
422
+ "lstrip": false,
423
+ "normalized": false,
424
+ "rstrip": false,
425
+ "single_word": false,
426
+ "special": true
427
+ },
428
+ "128053": {
429
+ "content": "<|reserved_special_token_45|>",
430
+ "lstrip": false,
431
+ "normalized": false,
432
+ "rstrip": false,
433
+ "single_word": false,
434
+ "special": true
435
+ },
436
+ "128054": {
437
+ "content": "<|reserved_special_token_46|>",
438
+ "lstrip": false,
439
+ "normalized": false,
440
+ "rstrip": false,
441
+ "single_word": false,
442
+ "special": true
443
+ },
444
+ "128055": {
445
+ "content": "<|reserved_special_token_47|>",
446
+ "lstrip": false,
447
+ "normalized": false,
448
+ "rstrip": false,
449
+ "single_word": false,
450
+ "special": true
451
+ },
452
+ "128056": {
453
+ "content": "<|reserved_special_token_48|>",
454
+ "lstrip": false,
455
+ "normalized": false,
456
+ "rstrip": false,
457
+ "single_word": false,
458
+ "special": true
459
+ },
460
+ "128057": {
461
+ "content": "<|reserved_special_token_49|>",
462
+ "lstrip": false,
463
+ "normalized": false,
464
+ "rstrip": false,
465
+ "single_word": false,
466
+ "special": true
467
+ },
468
+ "128058": {
469
+ "content": "<|reserved_special_token_50|>",
470
+ "lstrip": false,
471
+ "normalized": false,
472
+ "rstrip": false,
473
+ "single_word": false,
474
+ "special": true
475
+ },
476
+ "128059": {
477
+ "content": "<|reserved_special_token_51|>",
478
+ "lstrip": false,
479
+ "normalized": false,
480
+ "rstrip": false,
481
+ "single_word": false,
482
+ "special": true
483
+ },
484
+ "128060": {
485
+ "content": "<|reserved_special_token_52|>",
486
+ "lstrip": false,
487
+ "normalized": false,
488
+ "rstrip": false,
489
+ "single_word": false,
490
+ "special": true
491
+ },
492
+ "128061": {
493
+ "content": "<|reserved_special_token_53|>",
494
+ "lstrip": false,
495
+ "normalized": false,
496
+ "rstrip": false,
497
+ "single_word": false,
498
+ "special": true
499
+ },
500
+ "128062": {
501
+ "content": "<|reserved_special_token_54|>",
502
+ "lstrip": false,
503
+ "normalized": false,
504
+ "rstrip": false,
505
+ "single_word": false,
506
+ "special": true
507
+ },
508
+ "128063": {
509
+ "content": "<|reserved_special_token_55|>",
510
+ "lstrip": false,
511
+ "normalized": false,
512
+ "rstrip": false,
513
+ "single_word": false,
514
+ "special": true
515
+ },
516
+ "128064": {
517
+ "content": "<|reserved_special_token_56|>",
518
+ "lstrip": false,
519
+ "normalized": false,
520
+ "rstrip": false,
521
+ "single_word": false,
522
+ "special": true
523
+ },
524
+ "128065": {
525
+ "content": "<|reserved_special_token_57|>",
526
+ "lstrip": false,
527
+ "normalized": false,
528
+ "rstrip": false,
529
+ "single_word": false,
530
+ "special": true
531
+ },
532
+ "128066": {
533
+ "content": "<|reserved_special_token_58|>",
534
+ "lstrip": false,
535
+ "normalized": false,
536
+ "rstrip": false,
537
+ "single_word": false,
538
+ "special": true
539
+ },
540
+ "128067": {
541
+ "content": "<|reserved_special_token_59|>",
542
+ "lstrip": false,
543
+ "normalized": false,
544
+ "rstrip": false,
545
+ "single_word": false,
546
+ "special": true
547
+ },
548
+ "128068": {
549
+ "content": "<|reserved_special_token_60|>",
550
+ "lstrip": false,
551
+ "normalized": false,
552
+ "rstrip": false,
553
+ "single_word": false,
554
+ "special": true
555
+ },
556
+ "128069": {
557
+ "content": "<|reserved_special_token_61|>",
558
+ "lstrip": false,
559
+ "normalized": false,
560
+ "rstrip": false,
561
+ "single_word": false,
562
+ "special": true
563
+ },
564
+ "128070": {
565
+ "content": "<|reserved_special_token_62|>",
566
+ "lstrip": false,
567
+ "normalized": false,
568
+ "rstrip": false,
569
+ "single_word": false,
570
+ "special": true
571
+ },
572
+ "128071": {
573
+ "content": "<|reserved_special_token_63|>",
574
+ "lstrip": false,
575
+ "normalized": false,
576
+ "rstrip": false,
577
+ "single_word": false,
578
+ "special": true
579
+ },
580
+ "128072": {
581
+ "content": "<|reserved_special_token_64|>",
582
+ "lstrip": false,
583
+ "normalized": false,
584
+ "rstrip": false,
585
+ "single_word": false,
586
+ "special": true
587
+ },
588
+ "128073": {
589
+ "content": "<|reserved_special_token_65|>",
590
+ "lstrip": false,
591
+ "normalized": false,
592
+ "rstrip": false,
593
+ "single_word": false,
594
+ "special": true
595
+ },
596
+ "128074": {
597
+ "content": "<|reserved_special_token_66|>",
598
+ "lstrip": false,
599
+ "normalized": false,
600
+ "rstrip": false,
601
+ "single_word": false,
602
+ "special": true
603
+ },
604
+ "128075": {
605
+ "content": "<|reserved_special_token_67|>",
606
+ "lstrip": false,
607
+ "normalized": false,
608
+ "rstrip": false,
609
+ "single_word": false,
610
+ "special": true
611
+ },
612
+ "128076": {
613
+ "content": "<|reserved_special_token_68|>",
614
+ "lstrip": false,
615
+ "normalized": false,
616
+ "rstrip": false,
617
+ "single_word": false,
618
+ "special": true
619
+ },
620
+ "128077": {
621
+ "content": "<|reserved_special_token_69|>",
622
+ "lstrip": false,
623
+ "normalized": false,
624
+ "rstrip": false,
625
+ "single_word": false,
626
+ "special": true
627
+ },
628
+ "128078": {
629
+ "content": "<|reserved_special_token_70|>",
630
+ "lstrip": false,
631
+ "normalized": false,
632
+ "rstrip": false,
633
+ "single_word": false,
634
+ "special": true
635
+ },
636
+ "128079": {
637
+ "content": "<|reserved_special_token_71|>",
638
+ "lstrip": false,
639
+ "normalized": false,
640
+ "rstrip": false,
641
+ "single_word": false,
642
+ "special": true
643
+ },
644
+ "128080": {
645
+ "content": "<|reserved_special_token_72|>",
646
+ "lstrip": false,
647
+ "normalized": false,
648
+ "rstrip": false,
649
+ "single_word": false,
650
+ "special": true
651
+ },
652
+ "128081": {
653
+ "content": "<|reserved_special_token_73|>",
654
+ "lstrip": false,
655
+ "normalized": false,
656
+ "rstrip": false,
657
+ "single_word": false,
658
+ "special": true
659
+ },
660
+ "128082": {
661
+ "content": "<|reserved_special_token_74|>",
662
+ "lstrip": false,
663
+ "normalized": false,
664
+ "rstrip": false,
665
+ "single_word": false,
666
+ "special": true
667
+ },
668
+ "128083": {
669
+ "content": "<|reserved_special_token_75|>",
670
+ "lstrip": false,
671
+ "normalized": false,
672
+ "rstrip": false,
673
+ "single_word": false,
674
+ "special": true
675
+ },
676
+ "128084": {
677
+ "content": "<|reserved_special_token_76|>",
678
+ "lstrip": false,
679
+ "normalized": false,
680
+ "rstrip": false,
681
+ "single_word": false,
682
+ "special": true
683
+ },
684
+ "128085": {
685
+ "content": "<|reserved_special_token_77|>",
686
+ "lstrip": false,
687
+ "normalized": false,
688
+ "rstrip": false,
689
+ "single_word": false,
690
+ "special": true
691
+ },
692
+ "128086": {
693
+ "content": "<|reserved_special_token_78|>",
694
+ "lstrip": false,
695
+ "normalized": false,
696
+ "rstrip": false,
697
+ "single_word": false,
698
+ "special": true
699
+ },
700
+ "128087": {
701
+ "content": "<|reserved_special_token_79|>",
702
+ "lstrip": false,
703
+ "normalized": false,
704
+ "rstrip": false,
705
+ "single_word": false,
706
+ "special": true
707
+ },
708
+ "128088": {
709
+ "content": "<|reserved_special_token_80|>",
710
+ "lstrip": false,
711
+ "normalized": false,
712
+ "rstrip": false,
713
+ "single_word": false,
714
+ "special": true
715
+ },
716
+ "128089": {
717
+ "content": "<|reserved_special_token_81|>",
718
+ "lstrip": false,
719
+ "normalized": false,
720
+ "rstrip": false,
721
+ "single_word": false,
722
+ "special": true
723
+ },
724
+ "128090": {
725
+ "content": "<|reserved_special_token_82|>",
726
+ "lstrip": false,
727
+ "normalized": false,
728
+ "rstrip": false,
729
+ "single_word": false,
730
+ "special": true
731
+ },
732
+ "128091": {
733
+ "content": "<|reserved_special_token_83|>",
734
+ "lstrip": false,
735
+ "normalized": false,
736
+ "rstrip": false,
737
+ "single_word": false,
738
+ "special": true
739
+ },
740
+ "128092": {
741
+ "content": "<|reserved_special_token_84|>",
742
+ "lstrip": false,
743
+ "normalized": false,
744
+ "rstrip": false,
745
+ "single_word": false,
746
+ "special": true
747
+ },
748
+ "128093": {
749
+ "content": "<|reserved_special_token_85|>",
750
+ "lstrip": false,
751
+ "normalized": false,
752
+ "rstrip": false,
753
+ "single_word": false,
754
+ "special": true
755
+ },
756
+ "128094": {
757
+ "content": "<|reserved_special_token_86|>",
758
+ "lstrip": false,
759
+ "normalized": false,
760
+ "rstrip": false,
761
+ "single_word": false,
762
+ "special": true
763
+ },
764
+ "128095": {
765
+ "content": "<|reserved_special_token_87|>",
766
+ "lstrip": false,
767
+ "normalized": false,
768
+ "rstrip": false,
769
+ "single_word": false,
770
+ "special": true
771
+ },
772
+ "128096": {
773
+ "content": "<|reserved_special_token_88|>",
774
+ "lstrip": false,
775
+ "normalized": false,
776
+ "rstrip": false,
777
+ "single_word": false,
778
+ "special": true
779
+ },
780
+ "128097": {
781
+ "content": "<|reserved_special_token_89|>",
782
+ "lstrip": false,
783
+ "normalized": false,
784
+ "rstrip": false,
785
+ "single_word": false,
786
+ "special": true
787
+ },
788
+ "128098": {
789
+ "content": "<|reserved_special_token_90|>",
790
+ "lstrip": false,
791
+ "normalized": false,
792
+ "rstrip": false,
793
+ "single_word": false,
794
+ "special": true
795
+ },
796
+ "128099": {
797
+ "content": "<|reserved_special_token_91|>",
798
+ "lstrip": false,
799
+ "normalized": false,
800
+ "rstrip": false,
801
+ "single_word": false,
802
+ "special": true
803
+ },
804
+ "128100": {
805
+ "content": "<|reserved_special_token_92|>",
806
+ "lstrip": false,
807
+ "normalized": false,
808
+ "rstrip": false,
809
+ "single_word": false,
810
+ "special": true
811
+ },
812
+ "128101": {
813
+ "content": "<|reserved_special_token_93|>",
814
+ "lstrip": false,
815
+ "normalized": false,
816
+ "rstrip": false,
817
+ "single_word": false,
818
+ "special": true
819
+ },
820
+ "128102": {
821
+ "content": "<|reserved_special_token_94|>",
822
+ "lstrip": false,
823
+ "normalized": false,
824
+ "rstrip": false,
825
+ "single_word": false,
826
+ "special": true
827
+ },
828
+ "128103": {
829
+ "content": "<|reserved_special_token_95|>",
830
+ "lstrip": false,
831
+ "normalized": false,
832
+ "rstrip": false,
833
+ "single_word": false,
834
+ "special": true
835
+ },
836
+ "128104": {
837
+ "content": "<|reserved_special_token_96|>",
838
+ "lstrip": false,
839
+ "normalized": false,
840
+ "rstrip": false,
841
+ "single_word": false,
842
+ "special": true
843
+ },
844
+ "128105": {
845
+ "content": "<|reserved_special_token_97|>",
846
+ "lstrip": false,
847
+ "normalized": false,
848
+ "rstrip": false,
849
+ "single_word": false,
850
+ "special": true
851
+ },
852
+ "128106": {
853
+ "content": "<|reserved_special_token_98|>",
854
+ "lstrip": false,
855
+ "normalized": false,
856
+ "rstrip": false,
857
+ "single_word": false,
858
+ "special": true
859
+ },
860
+ "128107": {
861
+ "content": "<|reserved_special_token_99|>",
862
+ "lstrip": false,
863
+ "normalized": false,
864
+ "rstrip": false,
865
+ "single_word": false,
866
+ "special": true
867
+ },
868
+ "128108": {
869
+ "content": "<|reserved_special_token_100|>",
870
+ "lstrip": false,
871
+ "normalized": false,
872
+ "rstrip": false,
873
+ "single_word": false,
874
+ "special": true
875
+ },
876
+ "128109": {
877
+ "content": "<|reserved_special_token_101|>",
878
+ "lstrip": false,
879
+ "normalized": false,
880
+ "rstrip": false,
881
+ "single_word": false,
882
+ "special": true
883
+ },
884
+ "128110": {
885
+ "content": "<|reserved_special_token_102|>",
886
+ "lstrip": false,
887
+ "normalized": false,
888
+ "rstrip": false,
889
+ "single_word": false,
890
+ "special": true
891
+ },
892
+ "128111": {
893
+ "content": "<|reserved_special_token_103|>",
894
+ "lstrip": false,
895
+ "normalized": false,
896
+ "rstrip": false,
897
+ "single_word": false,
898
+ "special": true
899
+ },
900
+ "128112": {
901
+ "content": "<|reserved_special_token_104|>",
902
+ "lstrip": false,
903
+ "normalized": false,
904
+ "rstrip": false,
905
+ "single_word": false,
906
+ "special": true
907
+ },
908
+ "128113": {
909
+ "content": "<|reserved_special_token_105|>",
910
+ "lstrip": false,
911
+ "normalized": false,
912
+ "rstrip": false,
913
+ "single_word": false,
914
+ "special": true
915
+ },
916
+ "128114": {
917
+ "content": "<|reserved_special_token_106|>",
918
+ "lstrip": false,
919
+ "normalized": false,
920
+ "rstrip": false,
921
+ "single_word": false,
922
+ "special": true
923
+ },
924
+ "128115": {
925
+ "content": "<|reserved_special_token_107|>",
926
+ "lstrip": false,
927
+ "normalized": false,
928
+ "rstrip": false,
929
+ "single_word": false,
930
+ "special": true
931
+ },
932
+ "128116": {
933
+ "content": "<|reserved_special_token_108|>",
934
+ "lstrip": false,
935
+ "normalized": false,
936
+ "rstrip": false,
937
+ "single_word": false,
938
+ "special": true
939
+ },
940
+ "128117": {
941
+ "content": "<|reserved_special_token_109|>",
942
+ "lstrip": false,
943
+ "normalized": false,
944
+ "rstrip": false,
945
+ "single_word": false,
946
+ "special": true
947
+ },
948
+ "128118": {
949
+ "content": "<|reserved_special_token_110|>",
950
+ "lstrip": false,
951
+ "normalized": false,
952
+ "rstrip": false,
953
+ "single_word": false,
954
+ "special": true
955
+ },
956
+ "128119": {
957
+ "content": "<|reserved_special_token_111|>",
958
+ "lstrip": false,
959
+ "normalized": false,
960
+ "rstrip": false,
961
+ "single_word": false,
962
+ "special": true
963
+ },
964
+ "128120": {
965
+ "content": "<|reserved_special_token_112|>",
966
+ "lstrip": false,
967
+ "normalized": false,
968
+ "rstrip": false,
969
+ "single_word": false,
970
+ "special": true
971
+ },
972
+ "128121": {
973
+ "content": "<|reserved_special_token_113|>",
974
+ "lstrip": false,
975
+ "normalized": false,
976
+ "rstrip": false,
977
+ "single_word": false,
978
+ "special": true
979
+ },
980
+ "128122": {
981
+ "content": "<|reserved_special_token_114|>",
982
+ "lstrip": false,
983
+ "normalized": false,
984
+ "rstrip": false,
985
+ "single_word": false,
986
+ "special": true
987
+ },
988
+ "128123": {
989
+ "content": "<|reserved_special_token_115|>",
990
+ "lstrip": false,
991
+ "normalized": false,
992
+ "rstrip": false,
993
+ "single_word": false,
994
+ "special": true
995
+ },
996
+ "128124": {
997
+ "content": "<|reserved_special_token_116|>",
998
+ "lstrip": false,
999
+ "normalized": false,
1000
+ "rstrip": false,
1001
+ "single_word": false,
1002
+ "special": true
1003
+ },
1004
+ "128125": {
1005
+ "content": "<|reserved_special_token_117|>",
1006
+ "lstrip": false,
1007
+ "normalized": false,
1008
+ "rstrip": false,
1009
+ "single_word": false,
1010
+ "special": true
1011
+ },
1012
+ "128126": {
1013
+ "content": "<|reserved_special_token_118|>",
1014
+ "lstrip": false,
1015
+ "normalized": false,
1016
+ "rstrip": false,
1017
+ "single_word": false,
1018
+ "special": true
1019
+ },
1020
+ "128127": {
1021
+ "content": "<|reserved_special_token_119|>",
1022
+ "lstrip": false,
1023
+ "normalized": false,
1024
+ "rstrip": false,
1025
+ "single_word": false,
1026
+ "special": true
1027
+ },
1028
+ "128128": {
1029
+ "content": "<|reserved_special_token_120|>",
1030
+ "lstrip": false,
1031
+ "normalized": false,
1032
+ "rstrip": false,
1033
+ "single_word": false,
1034
+ "special": true
1035
+ },
1036
+ "128129": {
1037
+ "content": "<|reserved_special_token_121|>",
1038
+ "lstrip": false,
1039
+ "normalized": false,
1040
+ "rstrip": false,
1041
+ "single_word": false,
1042
+ "special": true
1043
+ },
1044
+ "128130": {
1045
+ "content": "<|reserved_special_token_122|>",
1046
+ "lstrip": false,
1047
+ "normalized": false,
1048
+ "rstrip": false,
1049
+ "single_word": false,
1050
+ "special": true
1051
+ },
1052
+ "128131": {
1053
+ "content": "<|reserved_special_token_123|>",
1054
+ "lstrip": false,
1055
+ "normalized": false,
1056
+ "rstrip": false,
1057
+ "single_word": false,
1058
+ "special": true
1059
+ },
1060
+ "128132": {
1061
+ "content": "<|reserved_special_token_124|>",
1062
+ "lstrip": false,
1063
+ "normalized": false,
1064
+ "rstrip": false,
1065
+ "single_word": false,
1066
+ "special": true
1067
+ },
1068
+ "128133": {
1069
+ "content": "<|reserved_special_token_125|>",
1070
+ "lstrip": false,
1071
+ "normalized": false,
1072
+ "rstrip": false,
1073
+ "single_word": false,
1074
+ "special": true
1075
+ },
1076
+ "128134": {
1077
+ "content": "<|reserved_special_token_126|>",
1078
+ "lstrip": false,
1079
+ "normalized": false,
1080
+ "rstrip": false,
1081
+ "single_word": false,
1082
+ "special": true
1083
+ },
1084
+ "128135": {
1085
+ "content": "<|reserved_special_token_127|>",
1086
+ "lstrip": false,
1087
+ "normalized": false,
1088
+ "rstrip": false,
1089
+ "single_word": false,
1090
+ "special": true
1091
+ },
1092
+ "128136": {
1093
+ "content": "<|reserved_special_token_128|>",
1094
+ "lstrip": false,
1095
+ "normalized": false,
1096
+ "rstrip": false,
1097
+ "single_word": false,
1098
+ "special": true
1099
+ },
1100
+ "128137": {
1101
+ "content": "<|reserved_special_token_129|>",
1102
+ "lstrip": false,
1103
+ "normalized": false,
1104
+ "rstrip": false,
1105
+ "single_word": false,
1106
+ "special": true
1107
+ },
1108
+ "128138": {
1109
+ "content": "<|reserved_special_token_130|>",
1110
+ "lstrip": false,
1111
+ "normalized": false,
1112
+ "rstrip": false,
1113
+ "single_word": false,
1114
+ "special": true
1115
+ },
1116
+ "128139": {
1117
+ "content": "<|reserved_special_token_131|>",
1118
+ "lstrip": false,
1119
+ "normalized": false,
1120
+ "rstrip": false,
1121
+ "single_word": false,
1122
+ "special": true
1123
+ },
1124
+ "128140": {
1125
+ "content": "<|reserved_special_token_132|>",
1126
+ "lstrip": false,
1127
+ "normalized": false,
1128
+ "rstrip": false,
1129
+ "single_word": false,
1130
+ "special": true
1131
+ },
1132
+ "128141": {
1133
+ "content": "<|reserved_special_token_133|>",
1134
+ "lstrip": false,
1135
+ "normalized": false,
1136
+ "rstrip": false,
1137
+ "single_word": false,
1138
+ "special": true
1139
+ },
1140
+ "128142": {
1141
+ "content": "<|reserved_special_token_134|>",
1142
+ "lstrip": false,
1143
+ "normalized": false,
1144
+ "rstrip": false,
1145
+ "single_word": false,
1146
+ "special": true
1147
+ },
1148
+ "128143": {
1149
+ "content": "<|reserved_special_token_135|>",
1150
+ "lstrip": false,
1151
+ "normalized": false,
1152
+ "rstrip": false,
1153
+ "single_word": false,
1154
+ "special": true
1155
+ },
1156
+ "128144": {
1157
+ "content": "<|reserved_special_token_136|>",
1158
+ "lstrip": false,
1159
+ "normalized": false,
1160
+ "rstrip": false,
1161
+ "single_word": false,
1162
+ "special": true
1163
+ },
1164
+ "128145": {
1165
+ "content": "<|reserved_special_token_137|>",
1166
+ "lstrip": false,
1167
+ "normalized": false,
1168
+ "rstrip": false,
1169
+ "single_word": false,
1170
+ "special": true
1171
+ },
1172
+ "128146": {
1173
+ "content": "<|reserved_special_token_138|>",
1174
+ "lstrip": false,
1175
+ "normalized": false,
1176
+ "rstrip": false,
1177
+ "single_word": false,
1178
+ "special": true
1179
+ },
1180
+ "128147": {
1181
+ "content": "<|reserved_special_token_139|>",
1182
+ "lstrip": false,
1183
+ "normalized": false,
1184
+ "rstrip": false,
1185
+ "single_word": false,
1186
+ "special": true
1187
+ },
1188
+ "128148": {
1189
+ "content": "<|reserved_special_token_140|>",
1190
+ "lstrip": false,
1191
+ "normalized": false,
1192
+ "rstrip": false,
1193
+ "single_word": false,
1194
+ "special": true
1195
+ },
1196
+ "128149": {
1197
+ "content": "<|reserved_special_token_141|>",
1198
+ "lstrip": false,
1199
+ "normalized": false,
1200
+ "rstrip": false,
1201
+ "single_word": false,
1202
+ "special": true
1203
+ },
1204
+ "128150": {
1205
+ "content": "<|reserved_special_token_142|>",
1206
+ "lstrip": false,
1207
+ "normalized": false,
1208
+ "rstrip": false,
1209
+ "single_word": false,
1210
+ "special": true
1211
+ },
1212
+ "128151": {
1213
+ "content": "<|reserved_special_token_143|>",
1214
+ "lstrip": false,
1215
+ "normalized": false,
1216
+ "rstrip": false,
1217
+ "single_word": false,
1218
+ "special": true
1219
+ },
1220
+ "128152": {
1221
+ "content": "<|reserved_special_token_144|>",
1222
+ "lstrip": false,
1223
+ "normalized": false,
1224
+ "rstrip": false,
1225
+ "single_word": false,
1226
+ "special": true
1227
+ },
1228
+ "128153": {
1229
+ "content": "<|reserved_special_token_145|>",
1230
+ "lstrip": false,
1231
+ "normalized": false,
1232
+ "rstrip": false,
1233
+ "single_word": false,
1234
+ "special": true
1235
+ },
1236
+ "128154": {
1237
+ "content": "<|reserved_special_token_146|>",
1238
+ "lstrip": false,
1239
+ "normalized": false,
1240
+ "rstrip": false,
1241
+ "single_word": false,
1242
+ "special": true
1243
+ },
1244
+ "128155": {
1245
+ "content": "<|reserved_special_token_147|>",
1246
+ "lstrip": false,
1247
+ "normalized": false,
1248
+ "rstrip": false,
1249
+ "single_word": false,
1250
+ "special": true
1251
+ },
1252
+ "128156": {
1253
+ "content": "<|reserved_special_token_148|>",
1254
+ "lstrip": false,
1255
+ "normalized": false,
1256
+ "rstrip": false,
1257
+ "single_word": false,
1258
+ "special": true
1259
+ },
1260
+ "128157": {
1261
+ "content": "<|reserved_special_token_149|>",
1262
+ "lstrip": false,
1263
+ "normalized": false,
1264
+ "rstrip": false,
1265
+ "single_word": false,
1266
+ "special": true
1267
+ },
1268
+ "128158": {
1269
+ "content": "<|reserved_special_token_150|>",
1270
+ "lstrip": false,
1271
+ "normalized": false,
1272
+ "rstrip": false,
1273
+ "single_word": false,
1274
+ "special": true
1275
+ },
1276
+ "128159": {
1277
+ "content": "<|reserved_special_token_151|>",
1278
+ "lstrip": false,
1279
+ "normalized": false,
1280
+ "rstrip": false,
1281
+ "single_word": false,
1282
+ "special": true
1283
+ },
1284
+ "128160": {
1285
+ "content": "<|reserved_special_token_152|>",
1286
+ "lstrip": false,
1287
+ "normalized": false,
1288
+ "rstrip": false,
1289
+ "single_word": false,
1290
+ "special": true
1291
+ },
1292
+ "128161": {
1293
+ "content": "<|reserved_special_token_153|>",
1294
+ "lstrip": false,
1295
+ "normalized": false,
1296
+ "rstrip": false,
1297
+ "single_word": false,
1298
+ "special": true
1299
+ },
1300
+ "128162": {
1301
+ "content": "<|reserved_special_token_154|>",
1302
+ "lstrip": false,
1303
+ "normalized": false,
1304
+ "rstrip": false,
1305
+ "single_word": false,
1306
+ "special": true
1307
+ },
1308
+ "128163": {
1309
+ "content": "<|reserved_special_token_155|>",
1310
+ "lstrip": false,
1311
+ "normalized": false,
1312
+ "rstrip": false,
1313
+ "single_word": false,
1314
+ "special": true
1315
+ },
1316
+ "128164": {
1317
+ "content": "<|reserved_special_token_156|>",
1318
+ "lstrip": false,
1319
+ "normalized": false,
1320
+ "rstrip": false,
1321
+ "single_word": false,
1322
+ "special": true
1323
+ },
1324
+ "128165": {
1325
+ "content": "<|reserved_special_token_157|>",
1326
+ "lstrip": false,
1327
+ "normalized": false,
1328
+ "rstrip": false,
1329
+ "single_word": false,
1330
+ "special": true
1331
+ },
1332
+ "128166": {
1333
+ "content": "<|reserved_special_token_158|>",
1334
+ "lstrip": false,
1335
+ "normalized": false,
1336
+ "rstrip": false,
1337
+ "single_word": false,
1338
+ "special": true
1339
+ },
1340
+ "128167": {
1341
+ "content": "<|reserved_special_token_159|>",
1342
+ "lstrip": false,
1343
+ "normalized": false,
1344
+ "rstrip": false,
1345
+ "single_word": false,
1346
+ "special": true
1347
+ },
1348
+ "128168": {
1349
+ "content": "<|reserved_special_token_160|>",
1350
+ "lstrip": false,
1351
+ "normalized": false,
1352
+ "rstrip": false,
1353
+ "single_word": false,
1354
+ "special": true
1355
+ },
1356
+ "128169": {
1357
+ "content": "<|reserved_special_token_161|>",
1358
+ "lstrip": false,
1359
+ "normalized": false,
1360
+ "rstrip": false,
1361
+ "single_word": false,
1362
+ "special": true
1363
+ },
1364
+ "128170": {
1365
+ "content": "<|reserved_special_token_162|>",
1366
+ "lstrip": false,
1367
+ "normalized": false,
1368
+ "rstrip": false,
1369
+ "single_word": false,
1370
+ "special": true
1371
+ },
1372
+ "128171": {
1373
+ "content": "<|reserved_special_token_163|>",
1374
+ "lstrip": false,
1375
+ "normalized": false,
1376
+ "rstrip": false,
1377
+ "single_word": false,
1378
+ "special": true
1379
+ },
1380
+ "128172": {
1381
+ "content": "<|reserved_special_token_164|>",
1382
+ "lstrip": false,
1383
+ "normalized": false,
1384
+ "rstrip": false,
1385
+ "single_word": false,
1386
+ "special": true
1387
+ },
1388
+ "128173": {
1389
+ "content": "<|reserved_special_token_165|>",
1390
+ "lstrip": false,
1391
+ "normalized": false,
1392
+ "rstrip": false,
1393
+ "single_word": false,
1394
+ "special": true
1395
+ },
1396
+ "128174": {
1397
+ "content": "<|reserved_special_token_166|>",
1398
+ "lstrip": false,
1399
+ "normalized": false,
1400
+ "rstrip": false,
1401
+ "single_word": false,
1402
+ "special": true
1403
+ },
1404
+ "128175": {
1405
+ "content": "<|reserved_special_token_167|>",
1406
+ "lstrip": false,
1407
+ "normalized": false,
1408
+ "rstrip": false,
1409
+ "single_word": false,
1410
+ "special": true
1411
+ },
1412
+ "128176": {
1413
+ "content": "<|reserved_special_token_168|>",
1414
+ "lstrip": false,
1415
+ "normalized": false,
1416
+ "rstrip": false,
1417
+ "single_word": false,
1418
+ "special": true
1419
+ },
1420
+ "128177": {
1421
+ "content": "<|reserved_special_token_169|>",
1422
+ "lstrip": false,
1423
+ "normalized": false,
1424
+ "rstrip": false,
1425
+ "single_word": false,
1426
+ "special": true
1427
+ },
1428
+ "128178": {
1429
+ "content": "<|reserved_special_token_170|>",
1430
+ "lstrip": false,
1431
+ "normalized": false,
1432
+ "rstrip": false,
1433
+ "single_word": false,
1434
+ "special": true
1435
+ },
1436
+ "128179": {
1437
+ "content": "<|reserved_special_token_171|>",
1438
+ "lstrip": false,
1439
+ "normalized": false,
1440
+ "rstrip": false,
1441
+ "single_word": false,
1442
+ "special": true
1443
+ },
1444
+ "128180": {
1445
+ "content": "<|reserved_special_token_172|>",
1446
+ "lstrip": false,
1447
+ "normalized": false,
1448
+ "rstrip": false,
1449
+ "single_word": false,
1450
+ "special": true
1451
+ },
1452
+ "128181": {
1453
+ "content": "<|reserved_special_token_173|>",
1454
+ "lstrip": false,
1455
+ "normalized": false,
1456
+ "rstrip": false,
1457
+ "single_word": false,
1458
+ "special": true
1459
+ },
1460
+ "128182": {
1461
+ "content": "<|reserved_special_token_174|>",
1462
+ "lstrip": false,
1463
+ "normalized": false,
1464
+ "rstrip": false,
1465
+ "single_word": false,
1466
+ "special": true
1467
+ },
1468
+ "128183": {
1469
+ "content": "<|reserved_special_token_175|>",
1470
+ "lstrip": false,
1471
+ "normalized": false,
1472
+ "rstrip": false,
1473
+ "single_word": false,
1474
+ "special": true
1475
+ },
1476
+ "128184": {
1477
+ "content": "<|reserved_special_token_176|>",
1478
+ "lstrip": false,
1479
+ "normalized": false,
1480
+ "rstrip": false,
1481
+ "single_word": false,
1482
+ "special": true
1483
+ },
1484
+ "128185": {
1485
+ "content": "<|reserved_special_token_177|>",
1486
+ "lstrip": false,
1487
+ "normalized": false,
1488
+ "rstrip": false,
1489
+ "single_word": false,
1490
+ "special": true
1491
+ },
1492
+ "128186": {
1493
+ "content": "<|reserved_special_token_178|>",
1494
+ "lstrip": false,
1495
+ "normalized": false,
1496
+ "rstrip": false,
1497
+ "single_word": false,
1498
+ "special": true
1499
+ },
1500
+ "128187": {
1501
+ "content": "<|reserved_special_token_179|>",
1502
+ "lstrip": false,
1503
+ "normalized": false,
1504
+ "rstrip": false,
1505
+ "single_word": false,
1506
+ "special": true
1507
+ },
1508
+ "128188": {
1509
+ "content": "<|reserved_special_token_180|>",
1510
+ "lstrip": false,
1511
+ "normalized": false,
1512
+ "rstrip": false,
1513
+ "single_word": false,
1514
+ "special": true
1515
+ },
1516
+ "128189": {
1517
+ "content": "<|reserved_special_token_181|>",
1518
+ "lstrip": false,
1519
+ "normalized": false,
1520
+ "rstrip": false,
1521
+ "single_word": false,
1522
+ "special": true
1523
+ },
1524
+ "128190": {
1525
+ "content": "<|reserved_special_token_182|>",
1526
+ "lstrip": false,
1527
+ "normalized": false,
1528
+ "rstrip": false,
1529
+ "single_word": false,
1530
+ "special": true
1531
+ },
1532
+ "128191": {
1533
+ "content": "<|reserved_special_token_183|>",
1534
+ "lstrip": false,
1535
+ "normalized": false,
1536
+ "rstrip": false,
1537
+ "single_word": false,
1538
+ "special": true
1539
+ },
1540
+ "128192": {
1541
+ "content": "<|reserved_special_token_184|>",
1542
+ "lstrip": false,
1543
+ "normalized": false,
1544
+ "rstrip": false,
1545
+ "single_word": false,
1546
+ "special": true
1547
+ },
1548
+ "128193": {
1549
+ "content": "<|reserved_special_token_185|>",
1550
+ "lstrip": false,
1551
+ "normalized": false,
1552
+ "rstrip": false,
1553
+ "single_word": false,
1554
+ "special": true
1555
+ },
1556
+ "128194": {
1557
+ "content": "<|reserved_special_token_186|>",
1558
+ "lstrip": false,
1559
+ "normalized": false,
1560
+ "rstrip": false,
1561
+ "single_word": false,
1562
+ "special": true
1563
+ },
1564
+ "128195": {
1565
+ "content": "<|reserved_special_token_187|>",
1566
+ "lstrip": false,
1567
+ "normalized": false,
1568
+ "rstrip": false,
1569
+ "single_word": false,
1570
+ "special": true
1571
+ },
1572
+ "128196": {
1573
+ "content": "<|reserved_special_token_188|>",
1574
+ "lstrip": false,
1575
+ "normalized": false,
1576
+ "rstrip": false,
1577
+ "single_word": false,
1578
+ "special": true
1579
+ },
1580
+ "128197": {
1581
+ "content": "<|reserved_special_token_189|>",
1582
+ "lstrip": false,
1583
+ "normalized": false,
1584
+ "rstrip": false,
1585
+ "single_word": false,
1586
+ "special": true
1587
+ },
1588
+ "128198": {
1589
+ "content": "<|reserved_special_token_190|>",
1590
+ "lstrip": false,
1591
+ "normalized": false,
1592
+ "rstrip": false,
1593
+ "single_word": false,
1594
+ "special": true
1595
+ },
1596
+ "128199": {
1597
+ "content": "<|reserved_special_token_191|>",
1598
+ "lstrip": false,
1599
+ "normalized": false,
1600
+ "rstrip": false,
1601
+ "single_word": false,
1602
+ "special": true
1603
+ },
1604
+ "128200": {
1605
+ "content": "<|reserved_special_token_192|>",
1606
+ "lstrip": false,
1607
+ "normalized": false,
1608
+ "rstrip": false,
1609
+ "single_word": false,
1610
+ "special": true
1611
+ },
1612
+ "128201": {
1613
+ "content": "<|reserved_special_token_193|>",
1614
+ "lstrip": false,
1615
+ "normalized": false,
1616
+ "rstrip": false,
1617
+ "single_word": false,
1618
+ "special": true
1619
+ },
1620
+ "128202": {
1621
+ "content": "<|reserved_special_token_194|>",
1622
+ "lstrip": false,
1623
+ "normalized": false,
1624
+ "rstrip": false,
1625
+ "single_word": false,
1626
+ "special": true
1627
+ },
1628
+ "128203": {
1629
+ "content": "<|reserved_special_token_195|>",
1630
+ "lstrip": false,
1631
+ "normalized": false,
1632
+ "rstrip": false,
1633
+ "single_word": false,
1634
+ "special": true
1635
+ },
1636
+ "128204": {
1637
+ "content": "<|reserved_special_token_196|>",
1638
+ "lstrip": false,
1639
+ "normalized": false,
1640
+ "rstrip": false,
1641
+ "single_word": false,
1642
+ "special": true
1643
+ },
1644
+ "128205": {
1645
+ "content": "<|reserved_special_token_197|>",
1646
+ "lstrip": false,
1647
+ "normalized": false,
1648
+ "rstrip": false,
1649
+ "single_word": false,
1650
+ "special": true
1651
+ },
1652
+ "128206": {
1653
+ "content": "<|reserved_special_token_198|>",
1654
+ "lstrip": false,
1655
+ "normalized": false,
1656
+ "rstrip": false,
1657
+ "single_word": false,
1658
+ "special": true
1659
+ },
1660
+ "128207": {
1661
+ "content": "<|reserved_special_token_199|>",
1662
+ "lstrip": false,
1663
+ "normalized": false,
1664
+ "rstrip": false,
1665
+ "single_word": false,
1666
+ "special": true
1667
+ },
1668
+ "128208": {
1669
+ "content": "<|reserved_special_token_200|>",
1670
+ "lstrip": false,
1671
+ "normalized": false,
1672
+ "rstrip": false,
1673
+ "single_word": false,
1674
+ "special": true
1675
+ },
1676
+ "128209": {
1677
+ "content": "<|reserved_special_token_201|>",
1678
+ "lstrip": false,
1679
+ "normalized": false,
1680
+ "rstrip": false,
1681
+ "single_word": false,
1682
+ "special": true
1683
+ },
1684
+ "128210": {
1685
+ "content": "<|reserved_special_token_202|>",
1686
+ "lstrip": false,
1687
+ "normalized": false,
1688
+ "rstrip": false,
1689
+ "single_word": false,
1690
+ "special": true
1691
+ },
1692
+ "128211": {
1693
+ "content": "<|reserved_special_token_203|>",
1694
+ "lstrip": false,
1695
+ "normalized": false,
1696
+ "rstrip": false,
1697
+ "single_word": false,
1698
+ "special": true
1699
+ },
1700
+ "128212": {
1701
+ "content": "<|reserved_special_token_204|>",
1702
+ "lstrip": false,
1703
+ "normalized": false,
1704
+ "rstrip": false,
1705
+ "single_word": false,
1706
+ "special": true
1707
+ },
1708
+ "128213": {
1709
+ "content": "<|reserved_special_token_205|>",
1710
+ "lstrip": false,
1711
+ "normalized": false,
1712
+ "rstrip": false,
1713
+ "single_word": false,
1714
+ "special": true
1715
+ },
1716
+ "128214": {
1717
+ "content": "<|reserved_special_token_206|>",
1718
+ "lstrip": false,
1719
+ "normalized": false,
1720
+ "rstrip": false,
1721
+ "single_word": false,
1722
+ "special": true
1723
+ },
1724
+ "128215": {
1725
+ "content": "<|reserved_special_token_207|>",
1726
+ "lstrip": false,
1727
+ "normalized": false,
1728
+ "rstrip": false,
1729
+ "single_word": false,
1730
+ "special": true
1731
+ },
1732
+ "128216": {
1733
+ "content": "<|reserved_special_token_208|>",
1734
+ "lstrip": false,
1735
+ "normalized": false,
1736
+ "rstrip": false,
1737
+ "single_word": false,
1738
+ "special": true
1739
+ },
1740
+ "128217": {
1741
+ "content": "<|reserved_special_token_209|>",
1742
+ "lstrip": false,
1743
+ "normalized": false,
1744
+ "rstrip": false,
1745
+ "single_word": false,
1746
+ "special": true
1747
+ },
1748
+ "128218": {
1749
+ "content": "<|reserved_special_token_210|>",
1750
+ "lstrip": false,
1751
+ "normalized": false,
1752
+ "rstrip": false,
1753
+ "single_word": false,
1754
+ "special": true
1755
+ },
1756
+ "128219": {
1757
+ "content": "<|reserved_special_token_211|>",
1758
+ "lstrip": false,
1759
+ "normalized": false,
1760
+ "rstrip": false,
1761
+ "single_word": false,
1762
+ "special": true
1763
+ },
1764
+ "128220": {
1765
+ "content": "<|reserved_special_token_212|>",
1766
+ "lstrip": false,
1767
+ "normalized": false,
1768
+ "rstrip": false,
1769
+ "single_word": false,
1770
+ "special": true
1771
+ },
1772
+ "128221": {
1773
+ "content": "<|reserved_special_token_213|>",
1774
+ "lstrip": false,
1775
+ "normalized": false,
1776
+ "rstrip": false,
1777
+ "single_word": false,
1778
+ "special": true
1779
+ },
1780
+ "128222": {
1781
+ "content": "<|reserved_special_token_214|>",
1782
+ "lstrip": false,
1783
+ "normalized": false,
1784
+ "rstrip": false,
1785
+ "single_word": false,
1786
+ "special": true
1787
+ },
1788
+ "128223": {
1789
+ "content": "<|reserved_special_token_215|>",
1790
+ "lstrip": false,
1791
+ "normalized": false,
1792
+ "rstrip": false,
1793
+ "single_word": false,
1794
+ "special": true
1795
+ },
1796
+ "128224": {
1797
+ "content": "<|reserved_special_token_216|>",
1798
+ "lstrip": false,
1799
+ "normalized": false,
1800
+ "rstrip": false,
1801
+ "single_word": false,
1802
+ "special": true
1803
+ },
1804
+ "128225": {
1805
+ "content": "<|reserved_special_token_217|>",
1806
+ "lstrip": false,
1807
+ "normalized": false,
1808
+ "rstrip": false,
1809
+ "single_word": false,
1810
+ "special": true
1811
+ },
1812
+ "128226": {
1813
+ "content": "<|reserved_special_token_218|>",
1814
+ "lstrip": false,
1815
+ "normalized": false,
1816
+ "rstrip": false,
1817
+ "single_word": false,
1818
+ "special": true
1819
+ },
1820
+ "128227": {
1821
+ "content": "<|reserved_special_token_219|>",
1822
+ "lstrip": false,
1823
+ "normalized": false,
1824
+ "rstrip": false,
1825
+ "single_word": false,
1826
+ "special": true
1827
+ },
1828
+ "128228": {
1829
+ "content": "<|reserved_special_token_220|>",
1830
+ "lstrip": false,
1831
+ "normalized": false,
1832
+ "rstrip": false,
1833
+ "single_word": false,
1834
+ "special": true
1835
+ },
1836
+ "128229": {
1837
+ "content": "<|reserved_special_token_221|>",
1838
+ "lstrip": false,
1839
+ "normalized": false,
1840
+ "rstrip": false,
1841
+ "single_word": false,
1842
+ "special": true
1843
+ },
1844
+ "128230": {
1845
+ "content": "<|reserved_special_token_222|>",
1846
+ "lstrip": false,
1847
+ "normalized": false,
1848
+ "rstrip": false,
1849
+ "single_word": false,
1850
+ "special": true
1851
+ },
1852
+ "128231": {
1853
+ "content": "<|reserved_special_token_223|>",
1854
+ "lstrip": false,
1855
+ "normalized": false,
1856
+ "rstrip": false,
1857
+ "single_word": false,
1858
+ "special": true
1859
+ },
1860
+ "128232": {
1861
+ "content": "<|reserved_special_token_224|>",
1862
+ "lstrip": false,
1863
+ "normalized": false,
1864
+ "rstrip": false,
1865
+ "single_word": false,
1866
+ "special": true
1867
+ },
1868
+ "128233": {
1869
+ "content": "<|reserved_special_token_225|>",
1870
+ "lstrip": false,
1871
+ "normalized": false,
1872
+ "rstrip": false,
1873
+ "single_word": false,
1874
+ "special": true
1875
+ },
1876
+ "128234": {
1877
+ "content": "<|reserved_special_token_226|>",
1878
+ "lstrip": false,
1879
+ "normalized": false,
1880
+ "rstrip": false,
1881
+ "single_word": false,
1882
+ "special": true
1883
+ },
1884
+ "128235": {
1885
+ "content": "<|reserved_special_token_227|>",
1886
+ "lstrip": false,
1887
+ "normalized": false,
1888
+ "rstrip": false,
1889
+ "single_word": false,
1890
+ "special": true
1891
+ },
1892
+ "128236": {
1893
+ "content": "<|reserved_special_token_228|>",
1894
+ "lstrip": false,
1895
+ "normalized": false,
1896
+ "rstrip": false,
1897
+ "single_word": false,
1898
+ "special": true
1899
+ },
1900
+ "128237": {
1901
+ "content": "<|reserved_special_token_229|>",
1902
+ "lstrip": false,
1903
+ "normalized": false,
1904
+ "rstrip": false,
1905
+ "single_word": false,
1906
+ "special": true
1907
+ },
1908
+ "128238": {
1909
+ "content": "<|reserved_special_token_230|>",
1910
+ "lstrip": false,
1911
+ "normalized": false,
1912
+ "rstrip": false,
1913
+ "single_word": false,
1914
+ "special": true
1915
+ },
1916
+ "128239": {
1917
+ "content": "<|reserved_special_token_231|>",
1918
+ "lstrip": false,
1919
+ "normalized": false,
1920
+ "rstrip": false,
1921
+ "single_word": false,
1922
+ "special": true
1923
+ },
1924
+ "128240": {
1925
+ "content": "<|reserved_special_token_232|>",
1926
+ "lstrip": false,
1927
+ "normalized": false,
1928
+ "rstrip": false,
1929
+ "single_word": false,
1930
+ "special": true
1931
+ },
1932
+ "128241": {
1933
+ "content": "<|reserved_special_token_233|>",
1934
+ "lstrip": false,
1935
+ "normalized": false,
1936
+ "rstrip": false,
1937
+ "single_word": false,
1938
+ "special": true
1939
+ },
1940
+ "128242": {
1941
+ "content": "<|reserved_special_token_234|>",
1942
+ "lstrip": false,
1943
+ "normalized": false,
1944
+ "rstrip": false,
1945
+ "single_word": false,
1946
+ "special": true
1947
+ },
1948
+ "128243": {
1949
+ "content": "<|reserved_special_token_235|>",
1950
+ "lstrip": false,
1951
+ "normalized": false,
1952
+ "rstrip": false,
1953
+ "single_word": false,
1954
+ "special": true
1955
+ },
1956
+ "128244": {
1957
+ "content": "<|reserved_special_token_236|>",
1958
+ "lstrip": false,
1959
+ "normalized": false,
1960
+ "rstrip": false,
1961
+ "single_word": false,
1962
+ "special": true
1963
+ },
1964
+ "128245": {
1965
+ "content": "<|reserved_special_token_237|>",
1966
+ "lstrip": false,
1967
+ "normalized": false,
1968
+ "rstrip": false,
1969
+ "single_word": false,
1970
+ "special": true
1971
+ },
1972
+ "128246": {
1973
+ "content": "<|reserved_special_token_238|>",
1974
+ "lstrip": false,
1975
+ "normalized": false,
1976
+ "rstrip": false,
1977
+ "single_word": false,
1978
+ "special": true
1979
+ },
1980
+ "128247": {
1981
+ "content": "<|reserved_special_token_239|>",
1982
+ "lstrip": false,
1983
+ "normalized": false,
1984
+ "rstrip": false,
1985
+ "single_word": false,
1986
+ "special": true
1987
+ },
1988
+ "128248": {
1989
+ "content": "<|reserved_special_token_240|>",
1990
+ "lstrip": false,
1991
+ "normalized": false,
1992
+ "rstrip": false,
1993
+ "single_word": false,
1994
+ "special": true
1995
+ },
1996
+ "128249": {
1997
+ "content": "<|reserved_special_token_241|>",
1998
+ "lstrip": false,
1999
+ "normalized": false,
2000
+ "rstrip": false,
2001
+ "single_word": false,
2002
+ "special": true
2003
+ },
2004
+ "128250": {
2005
+ "content": "<|reserved_special_token_242|>",
2006
+ "lstrip": false,
2007
+ "normalized": false,
2008
+ "rstrip": false,
2009
+ "single_word": false,
2010
+ "special": true
2011
+ },
2012
+ "128251": {
2013
+ "content": "<|reserved_special_token_243|>",
2014
+ "lstrip": false,
2015
+ "normalized": false,
2016
+ "rstrip": false,
2017
+ "single_word": false,
2018
+ "special": true
2019
+ },
2020
+ "128252": {
2021
+ "content": "<|reserved_special_token_244|>",
2022
+ "lstrip": false,
2023
+ "normalized": false,
2024
+ "rstrip": false,
2025
+ "single_word": false,
2026
+ "special": true
2027
+ },
2028
+ "128253": {
2029
+ "content": "<|reserved_special_token_245|>",
2030
+ "lstrip": false,
2031
+ "normalized": false,
2032
+ "rstrip": false,
2033
+ "single_word": false,
2034
+ "special": true
2035
+ },
2036
+ "128254": {
2037
+ "content": "<|reserved_special_token_246|>",
2038
+ "lstrip": false,
2039
+ "normalized": false,
2040
+ "rstrip": false,
2041
+ "single_word": false,
2042
+ "special": true
2043
+ },
2044
+ "128255": {
2045
+ "content": "<|reserved_special_token_247|>",
2046
+ "lstrip": false,
2047
+ "normalized": false,
2048
+ "rstrip": false,
2049
+ "single_word": false,
2050
+ "special": true
2051
+ }
2052
+ },
+ "bos_token": "<|begin_of_text|>",
+ "clean_up_tokenization_spaces": true,
+ "eos_token": "<|end_of_text|>",
+ "extra_special_tokens": {},
+ "model_input_names": [
+ "input_ids",
+ "attention_mask"
+ ],
+ "model_max_length": 131072,
+ "pad_token": "<|finetune_right_pad_id|>",
+ "padding_side": "left",
+ "tokenizer_class": "PreTrainedTokenizer",
+ "unk_token": null
+ }
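
The settings at the end of this config pin left-sided padding with a dedicated pad token (`<|finetune_right_pad_id|>`) and a 131,072-token context window, and the reserved placeholders above fill out the vocabulary as atomic special tokens. As a minimal sketch of how these values surface once the tokenizer is loaded with `transformers` (the repo id below is a placeholder assumption, not a confirmed path):

```python
from transformers import AutoTokenizer

# Placeholder repo id for illustration -- substitute the actual model path.
tokenizer = AutoTokenizer.from_pretrained("shaddy43443/OlgunLLM")

# Values pinned by the tokenizer_config.json above:
print(tokenizer.bos_token)         # <|begin_of_text|>
print(tokenizer.eos_token)         # <|end_of_text|>
print(tokenizer.pad_token)         # <|finetune_right_pad_id|>
print(tokenizer.padding_side)      # left
print(tokenizer.model_max_length)  # 131072

# Reserved placeholders declared in added_tokens_decoder stay atomic,
# e.g. id 128013 maps to reserved_special_token_5:
print(tokenizer.convert_tokens_to_ids("<|reserved_special_token_5|>"))  # 128013
```

Left padding with a pad token distinct from EOS is the usual choice for batched causal-LM generation, since it keeps the final prompt position aligned across a batch instead of burying it behind padding.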