Mistral-Small-3.2-24B-Instruct-pruned-Q6_K.gguf - GGUF Internal File Dump

  • Endian: LITTLE endian
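
The endianness and the version / tensor count / KV count reported below all come from the fixed-size GGUF header at the very start of the file. A minimal sketch of reading it with Python's struct module (the file name is a placeholder for wherever the file sits locally):

```python
import struct

# GGUF v3 begins with a fixed header:
#   4-byte magic "GGUF", uint32 version, uint64 tensor_count, uint64 kv_count
# This file is little-endian (see Endian above), hence the "<" format prefix.
with open("Mistral-Small-3.2-24B-Instruct-pruned-Q6_K.gguf", "rb") as f:
    magic = f.read(4)
    assert magic == b"GGUF", "not a GGUF file"
    version, tensor_count, kv_count = struct.unpack("<IQQ", f.read(20))

print(version, tensor_count, kv_count)  # expected here: 3 345 46
```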

Key Value Metadata Store

There are 49 key-value pairs in this file (the three GGUF.* header fields at POS 1-3 are listed alongside the 46 entries reported by GGUF.kv_count)

| POS | TYPE | Count | Key | Value |
| --- | --- | --- | --- | --- |
| 1 | UINT32 | 1 | GGUF.version | 3 |
| 2 | UINT64 | 1 | GGUF.tensor_count | 345 |
| 3 | UINT64 | 1 | GGUF.kv_count | 46 |
| 4 | STRING | 1 | general.architecture | llama |
| 5 | STRING | 1 | general.type | model |
| 6 | STRING | 1 | general.name | Mistral Small 3.2 24B Instruct 2506 |
| 7 | STRING | 1 | general.version | 2506 |
| 8 | STRING | 1 | general.finetune | Instruct |
| 9 | STRING | 1 | general.basename | Mistral-Small-3.2 |
| 10 | STRING | 1 | general.size_label | 24B |
| 11 | STRING | 1 | general.license | apache-2.0 |
| 12 | UINT32 | 1 | general.base_model.count | 1 |
| 13 | STRING | 1 | general.base_model.0.name | Mistral Small 3.1 24B Base 2503 |
| 14 | STRING | 1 | general.base_model.0.version | 2503 |
| 15 | STRING | 1 | general.base_model.0.organization | Mistralai |
| 16 | STRING | 1 | general.base_model.0.repo_url | https://huggingface.co/mistral...istral-Small-3.1-24B-Base-2503 |
| 17 | [STRING] | 1 | general.tags | [ image-text-to-text ] |
| 18 | [STRING] | 24 | general.languages | [ en, fr, de, es, pt, ... ] |
| 19 | UINT32 | 1 | llama.context_length | 131072 |
| 20 | UINT32 | 1 | llama.embedding_length | 5120 |
| 21 | UINT32 | 1 | llama.feed_forward_length | 32768 |
| 22 | UINT32 | 1 | llama.attention.head_count | 32 |
| 23 | UINT32 | 1 | llama.attention.head_count_kv | 8 |
| 24 | FLOAT32 | 1 | llama.rope.freq_base | 1000000000.0 |
| 25 | FLOAT32 | 1 | llama.attention.layer_norm_rms_epsilon | 1e-05 |
| 26 | UINT32 | 1 | llama.attention.key_length | 128 |
| 27 | UINT32 | 1 | llama.attention.value_length | 128 |
| 28 | UINT32 | 1 | llama.vocab_size | 131072 |
| 29 | UINT32 | 1 | llama.rope.dimension_count | 128 |
| 30 | STRING | 1 | tokenizer.ggml.model | gpt2 |
| 31 | STRING | 1 | tokenizer.ggml.pre | tekken |
| 32 | [STRING] | 131072 | tokenizer.ggml.tokens | [ <unk>, <s>, </s>, [INST], [/INST], ... ] |
| 33 | [INT32] | 131072 | tokenizer.ggml.token_type | [ 3, 3, 3, 3, 3, 3, 3, ... ] |
| 34 | [STRING] | 269443 | tokenizer.ggml.merges | [ Ġ Ġ, Ġ t, e r, i n, Ġ ĠĠĠ, ... ] |
| 35 | UINT32 | 1 | tokenizer.ggml.bos_token_id | 1 |
| 36 | UINT32 | 1 | tokenizer.ggml.eos_token_id | 2 |
| 37 | UINT32 | 1 | tokenizer.ggml.unknown_token_id | 0 |
| 38 | UINT32 | 1 | tokenizer.ggml.padding_token_id | 11 |
| 39 | BOOL | 1 | tokenizer.ggml.add_bos_token | True |
| 40 | BOOL | 1 | tokenizer.ggml.add_sep_token | False |
| 41 | BOOL | 1 | tokenizer.ggml.add_eos_token | False |
| 42 | BOOL | 1 | tokenizer.ggml.add_space_prefix | False |
| 43 | UINT32 | 1 | llama.block_count | 38 |
| 44 | UINT32 | 1 | general.quantization_version | 2 |
| 45 | UINT32 | 1 | general.file_type | 18 |
| 46 | STRING | 1 | quantize.imatrix.file | ./imatrix/imatrix-Mistral-Smal...24B-Instruct-pruned-medium.dat |
| 47 | STRING | 1 | quantize.imatrix.dataset | ../../datasets/imatrix/text_eur_medium.txt |
| 48 | UINT32 | 1 | quantize.imatrix.entries_count | 266 |
| 49 | UINT32 | 1 | quantize.imatrix.chunks_count | 1778 |
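
All of the above can be recovered programmatically. Below is a minimal sketch, assuming the gguf Python package that ships with llama.cpp (pip install gguf) and its GGUFReader API. Note that general.file_type = 18 corresponds to LLAMA_FTYPE_MOSTLY_Q6_K in llama.cpp, matching the Q6_K label in the file name:

```python
from gguf import GGUFReader

reader = GGUFReader("Mistral-Small-3.2-24B-Instruct-pruned-Q6_K.gguf")

# reader.fields is an ordered mapping of metadata key -> ReaderField
for name in reader.fields:
    print(name)  # general.architecture, llama.context_length, ...

# reader.tensors holds one record per tensor in the file
print(len(reader.tensors))                        # expected here: 345
print(sum(t.n_elements for t in reader.tensors))  # expected here: 22460892160
```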

Tensors Overview ~22B Elements

Total number of elements in all tensors: 22460892160 Elements
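
The total decomposes exactly into the base tensor group plus the 38 transformer blocks itemized below (llama.block_count = 38, each block contributing 555755520 elements):

```python
# base group + 38 identical-size blocks (figures from the group tables below)
print(1342182400 + 38 * 555755520)  # 22460892160
```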

Tensor Data Offset

This table lists the offset (relative to the start of the file) and size of each tensor's data segment

T_ID Tensor Layer Name Data Offset (B) Data Size (B)
0 output.weight 0x7841e0 0x20d00000
1 output_norm.weight 0x214841e0 0x5000
2 token_embd.weight 0x214891e0 0x11300000
3 blk.0.attn_k.weight 0x327891e0 0x41a000
4 blk.0.attn_norm.weight 0x32ba31e0 0x5000
5 blk.0.attn_output.weight 0x32ba81e0 0x1068000
6 blk.0.attn_q.weight 0x33c101e0 0x1068000
7 blk.0.attn_v.weight 0x34c781e0 0x550000
8 blk.0.ffn_down.weight 0x351c81e0 0xaa00000
9 blk.0.ffn_gate.weight 0x3fbc81e0 0x8340000
10 blk.0.ffn_norm.weight 0x47f081e0 0x5000
11 blk.0.ffn_up.weight 0x47f0d1e0 0x8340000
12 blk.1.attn_k.weight 0x5024d1e0 0x41a000
13 blk.1.attn_norm.weight 0x506671e0 0x5000
14 blk.1.attn_output.weight 0x5066c1e0 0x1068000
15 blk.1.attn_q.weight 0x516d41e0 0x1068000
16 blk.1.attn_v.weight 0x5273c1e0 0x550000
17 blk.1.ffn_down.weight 0x52c8c1e0 0xaa00000
18 blk.1.ffn_gate.weight 0x5d68c1e0 0x8340000
19 blk.1.ffn_norm.weight 0x659cc1e0 0x5000
20 blk.1.ffn_up.weight 0x659d11e0 0x8340000
21 blk.2.attn_k.weight 0x6dd111e0 0x41a000
22 blk.2.attn_norm.weight 0x6e12b1e0 0x5000
23 blk.2.attn_output.weight 0x6e1301e0 0x1068000
24 blk.2.attn_q.weight 0x6f1981e0 0x1068000
25 blk.2.attn_v.weight 0x702001e0 0x550000
26 blk.2.ffn_down.weight 0x707501e0 0xaa00000
27 blk.2.ffn_gate.weight 0x7b1501e0 0x8340000
28 blk.2.ffn_norm.weight 0x834901e0 0x5000
29 blk.2.ffn_up.weight 0x834951e0 0x8340000
30 blk.3.attn_k.weight 0x8b7d51e0 0x41a000
31 blk.3.attn_norm.weight 0x8bbef1e0 0x5000
32 blk.3.attn_output.weight 0x8bbf41e0 0x1068000
33 blk.3.attn_q.weight 0x8cc5c1e0 0x1068000
34 blk.3.attn_v.weight 0x8dcc41e0 0x550000
35 blk.3.ffn_down.weight 0x8e2141e0 0xaa00000
36 blk.3.ffn_gate.weight 0x98c141e0 0x8340000
37 blk.3.ffn_norm.weight 0xa0f541e0 0x5000
38 blk.3.ffn_up.weight 0xa0f591e0 0x8340000
39 blk.4.attn_k.weight 0xa92991e0 0x41a000
40 blk.4.attn_norm.weight 0xa96b31e0 0x5000
41 blk.4.attn_output.weight 0xa96b81e0 0x1068000
42 blk.4.attn_q.weight 0xaa7201e0 0x1068000
43 blk.4.attn_v.weight 0xab7881e0 0x550000
44 blk.4.ffn_down.weight 0xabcd81e0 0xaa00000
45 blk.4.ffn_gate.weight 0xb66d81e0 0x8340000
46 blk.4.ffn_norm.weight 0xbea181e0 0x5000
47 blk.4.ffn_up.weight 0xbea1d1e0 0x8340000
48 blk.5.attn_k.weight 0xc6d5d1e0 0x41a000
49 blk.5.attn_norm.weight 0xc71771e0 0x5000
50 blk.5.attn_output.weight 0xc717c1e0 0x1068000
51 blk.5.attn_q.weight 0xc81e41e0 0x1068000
52 blk.5.attn_v.weight 0xc924c1e0 0x550000
53 blk.5.ffn_down.weight 0xc979c1e0 0xaa00000
54 blk.5.ffn_gate.weight 0xd419c1e0 0x8340000
55 blk.5.ffn_norm.weight 0xdc4dc1e0 0x5000
56 blk.5.ffn_up.weight 0xdc4e11e0 0x8340000
57 blk.6.attn_k.weight 0xe48211e0 0x41a000
58 blk.6.attn_norm.weight 0xe4c3b1e0 0x5000
59 blk.6.attn_output.weight 0xe4c401e0 0x1068000
60 blk.6.attn_q.weight 0xe5ca81e0 0x1068000
61 blk.6.attn_v.weight 0xe6d101e0 0x550000
62 blk.6.ffn_down.weight 0xe72601e0 0xaa00000
63 blk.6.ffn_gate.weight 0xf1c601e0 0x8340000
64 blk.6.ffn_norm.weight 0xf9fa01e0 0x5000
65 blk.6.ffn_up.weight 0xf9fa51e0 0x8340000
66 blk.7.attn_k.weight 0x1022e51e0 0x41a000
67 blk.7.attn_norm.weight 0x1026ff1e0 0x5000
68 blk.7.attn_output.weight 0x1027041e0 0x1068000
69 blk.7.attn_q.weight 0x10376c1e0 0x1068000
70 blk.7.attn_v.weight 0x1047d41e0 0x550000
71 blk.7.ffn_down.weight 0x104d241e0 0xaa00000
72 blk.7.ffn_gate.weight 0x10f7241e0 0x8340000
73 blk.7.ffn_norm.weight 0x117a641e0 0x5000
74 blk.7.ffn_up.weight 0x117a691e0 0x8340000
75 blk.8.attn_k.weight 0x11fda91e0 0x41a000
76 blk.8.attn_norm.weight 0x1201c31e0 0x5000
77 blk.8.attn_output.weight 0x1201c81e0 0x1068000
78 blk.8.attn_q.weight 0x1212301e0 0x1068000
79 blk.8.attn_v.weight 0x1222981e0 0x550000
80 blk.8.ffn_down.weight 0x1227e81e0 0xaa00000
81 blk.8.ffn_gate.weight 0x12d1e81e0 0x8340000
82 blk.8.ffn_norm.weight 0x1355281e0 0x5000
83 blk.8.ffn_up.weight 0x13552d1e0 0x8340000
84 blk.9.attn_k.weight 0x13d86d1e0 0x41a000
85 blk.9.attn_norm.weight 0x13dc871e0 0x5000
86 blk.9.attn_output.weight 0x13dc8c1e0 0x1068000
87 blk.9.attn_q.weight 0x13ecf41e0 0x1068000
88 blk.9.attn_v.weight 0x13fd5c1e0 0x550000
89 blk.9.ffn_down.weight 0x1402ac1e0 0xaa00000
90 blk.9.ffn_gate.weight 0x14acac1e0 0x8340000
91 blk.9.ffn_norm.weight 0x152fec1e0 0x5000
92 blk.9.ffn_up.weight 0x152ff11e0 0x8340000
93 blk.10.attn_k.weight 0x15b3311e0 0x41a000
94 blk.10.attn_norm.weight 0x15b74b1e0 0x5000
95 blk.10.attn_output.weight 0x15b7501e0 0x1068000
96 blk.10.attn_q.weight 0x15c7b81e0 0x1068000
97 blk.10.attn_v.weight 0x15d8201e0 0x550000
98 blk.10.ffn_down.weight 0x15dd701e0 0xaa00000
99 blk.10.ffn_gate.weight 0x1687701e0 0x8340000
100 blk.10.ffn_norm.weight 0x170ab01e0 0x5000
101 blk.10.ffn_up.weight 0x170ab51e0 0x8340000
102 blk.11.attn_k.weight 0x178df51e0 0x41a000
103 blk.11.attn_norm.weight 0x17920f1e0 0x5000
104 blk.11.attn_output.weight 0x1792141e0 0x1068000
105 blk.11.attn_q.weight 0x17a27c1e0 0x1068000
106 blk.11.attn_v.weight 0x17b2e41e0 0x550000
107 blk.11.ffn_down.weight 0x17b8341e0 0xaa00000
108 blk.11.ffn_gate.weight 0x1862341e0 0x8340000
109 blk.11.ffn_norm.weight 0x18e5741e0 0x5000
110 blk.11.ffn_up.weight 0x18e5791e0 0x8340000
111 blk.12.attn_k.weight 0x1968b91e0 0x41a000
112 blk.12.attn_norm.weight 0x196cd31e0 0x5000
113 blk.12.attn_output.weight 0x196cd81e0 0x1068000
114 blk.12.attn_q.weight 0x197d401e0 0x1068000
115 blk.12.attn_v.weight 0x198da81e0 0x550000
116 blk.12.ffn_down.weight 0x1992f81e0 0xaa00000
117 blk.12.ffn_gate.weight 0x1a3cf81e0 0x8340000
118 blk.12.ffn_norm.weight 0x1ac0381e0 0x5000
119 blk.12.ffn_up.weight 0x1ac03d1e0 0x8340000
120 blk.13.attn_k.weight 0x1b437d1e0 0x41a000
121 blk.13.attn_norm.weight 0x1b47971e0 0x5000
122 blk.13.attn_output.weight 0x1b479c1e0 0x1068000
123 blk.13.attn_q.weight 0x1b58041e0 0x1068000
124 blk.13.attn_v.weight 0x1b686c1e0 0x550000
125 blk.13.ffn_down.weight 0x1b6dbc1e0 0xaa00000
126 blk.13.ffn_gate.weight 0x1c17bc1e0 0x8340000
127 blk.13.ffn_norm.weight 0x1c9afc1e0 0x5000
128 blk.13.ffn_up.weight 0x1c9b011e0 0x8340000
129 blk.14.attn_k.weight 0x1d1e411e0 0x41a000
130 blk.14.attn_norm.weight 0x1d225b1e0 0x5000
131 blk.14.attn_output.weight 0x1d22601e0 0x1068000
132 blk.14.attn_q.weight 0x1d32c81e0 0x1068000
133 blk.14.attn_v.weight 0x1d43301e0 0x550000
134 blk.14.ffn_down.weight 0x1d48801e0 0xaa00000
135 blk.14.ffn_gate.weight 0x1df2801e0 0x8340000
136 blk.14.ffn_norm.weight 0x1e75c01e0 0x5000
137 blk.14.ffn_up.weight 0x1e75c51e0 0x8340000
138 blk.15.attn_k.weight 0x1ef9051e0 0x41a000
139 blk.15.attn_norm.weight 0x1efd1f1e0 0x5000
140 blk.15.attn_output.weight 0x1efd241e0 0x1068000
141 blk.15.attn_q.weight 0x1f0d8c1e0 0x1068000
142 blk.15.attn_v.weight 0x1f1df41e0 0x550000
143 blk.15.ffn_down.weight 0x1f23441e0 0xaa00000
144 blk.15.ffn_gate.weight 0x1fcd441e0 0x8340000
145 blk.15.ffn_norm.weight 0x2050841e0 0x5000
146 blk.15.ffn_up.weight 0x2050891e0 0x8340000
147 blk.16.attn_k.weight 0x20d3c91e0 0x41a000
148 blk.16.attn_norm.weight 0x20d7e31e0 0x5000
149 blk.16.attn_output.weight 0x20d7e81e0 0x1068000
150 blk.16.attn_q.weight 0x20e8501e0 0x1068000
151 blk.16.attn_v.weight 0x20f8b81e0 0x550000
152 blk.16.ffn_down.weight 0x20fe081e0 0xaa00000
153 blk.16.ffn_gate.weight 0x21a8081e0 0x8340000
154 blk.16.ffn_norm.weight 0x222b481e0 0x5000
155 blk.16.ffn_up.weight 0x222b4d1e0 0x8340000
156 blk.17.attn_k.weight 0x22ae8d1e0 0x370000
157 blk.17.attn_norm.weight 0x22b1fd1e0 0x5000
158 blk.17.attn_output.weight 0x22b2021e0 0x1068000
159 blk.17.attn_q.weight 0x22c26a1e0 0xdc0000
160 blk.17.attn_v.weight 0x22d02a1e0 0x41a000
161 blk.17.ffn_down.weight 0x22d4441e0 0xaa00000
162 blk.17.ffn_gate.weight 0x237e441e0 0x8340000
163 blk.17.ffn_norm.weight 0x2401841e0 0x5000
164 blk.17.ffn_up.weight 0x2401891e0 0x8340000
165 blk.18.attn_k.weight 0x2484c91e0 0x370000
166 blk.18.attn_norm.weight 0x2488391e0 0x5000
167 blk.18.attn_output.weight 0x24883e1e0 0x1068000
168 blk.18.attn_q.weight 0x2498a61e0 0xdc0000
169 blk.18.attn_v.weight 0x24a6661e0 0x41a000
170 blk.18.ffn_down.weight 0x24aa801e0 0xaa00000
171 blk.18.ffn_gate.weight 0x2554801e0 0x8340000
172 blk.18.ffn_norm.weight 0x25d7c01e0 0x5000
173 blk.18.ffn_up.weight 0x25d7c51e0 0x8340000
174 blk.19.attn_k.weight 0x265b051e0 0x41a000
175 blk.19.attn_norm.weight 0x265f1f1e0 0x5000
176 blk.19.attn_output.weight 0x265f241e0 0x1068000
177 blk.19.attn_q.weight 0x266f8c1e0 0x1068000
178 blk.19.attn_v.weight 0x267ff41e0 0x550000
179 blk.19.ffn_down.weight 0x2685441e0 0xaa00000
180 blk.19.ffn_gate.weight 0x272f441e0 0x8340000
181 blk.19.ffn_norm.weight 0x27b2841e0 0x5000
182 blk.19.ffn_up.weight 0x27b2891e0 0x8340000
183 blk.20.attn_k.weight 0x2835c91e0 0x370000
184 blk.20.attn_norm.weight 0x2839391e0 0x5000
185 blk.20.attn_output.weight 0x28393e1e0 0x1068000
186 blk.20.attn_q.weight 0x2849a61e0 0xdc0000
187 blk.20.attn_v.weight 0x2857661e0 0x41a000
188 blk.20.ffn_down.weight 0x285b801e0 0xaa00000
189 blk.20.ffn_gate.weight 0x2905801e0 0x6e00000
190 blk.20.ffn_norm.weight 0x2973801e0 0x5000
191 blk.20.ffn_up.weight 0x2973851e0 0x6e00000
192 blk.21.attn_k.weight 0x29e1851e0 0x41a000
193 blk.21.attn_norm.weight 0x29e59f1e0 0x5000
194 blk.21.attn_output.weight 0x29e5a41e0 0x1068000
195 blk.21.attn_q.weight 0x29f60c1e0 0x1068000
196 blk.21.attn_v.weight 0x2a06741e0 0x550000
197 blk.21.ffn_down.weight 0x2a0bc41e0 0xaa00000
198 blk.21.ffn_gate.weight 0x2ab5c41e0 0x6e00000
199 blk.21.ffn_norm.weight 0x2b23c41e0 0x5000
200 blk.21.ffn_up.weight 0x2b23c91e0 0x6e00000
201 blk.22.attn_k.weight 0x2b91c91e0 0x370000
202 blk.22.attn_norm.weight 0x2b95391e0 0x5000
203 blk.22.attn_output.weight 0x2b953e1e0 0x1068000
204 blk.22.attn_q.weight 0x2ba5a61e0 0xdc0000
205 blk.22.attn_v.weight 0x2bb3661e0 0x41a000
206 blk.22.ffn_down.weight 0x2bb7801e0 0xaa00000
207 blk.22.ffn_gate.weight 0x2c61801e0 0x6e00000
208 blk.22.ffn_norm.weight 0x2ccf801e0 0x5000
209 blk.22.ffn_up.weight 0x2ccf851e0 0x6e00000
210 blk.23.attn_k.weight 0x2d3d851e0 0x370000
211 blk.23.attn_norm.weight 0x2d40f51e0 0x5000
212 blk.23.attn_output.weight 0x2d40fa1e0 0x1068000
213 blk.23.attn_q.weight 0x2d51621e0 0xdc0000
214 blk.23.attn_v.weight 0x2d5f221e0 0x41a000
215 blk.23.ffn_down.weight 0x2d633c1e0 0xaa00000
216 blk.23.ffn_gate.weight 0x2e0d3c1e0 0x6e00000
217 blk.23.ffn_norm.weight 0x2e7b3c1e0 0x5000
218 blk.23.ffn_up.weight 0x2e7b411e0 0x6e00000
219 blk.24.attn_k.weight 0x2ee9411e0 0x370000
220 blk.24.attn_norm.weight 0x2eecb11e0 0x5000
221 blk.24.attn_output.weight 0x2eecb61e0 0x1068000
222 blk.24.attn_q.weight 0x2efd1e1e0 0xdc0000
223 blk.24.attn_v.weight 0x2f0ade1e0 0x41a000
224 blk.24.ffn_down.weight 0x2f0ef81e0 0xaa00000
225 blk.24.ffn_gate.weight 0x2fb8f81e0 0x6e00000
226 blk.24.ffn_norm.weight 0x3026f81e0 0x5000
227 blk.24.ffn_up.weight 0x3026fd1e0 0x6e00000
228 blk.25.attn_k.weight 0x3094fd1e0 0x370000
229 blk.25.attn_norm.weight 0x30986d1e0 0x5000
230 blk.25.attn_output.weight 0x3098721e0 0x1068000
231 blk.25.attn_q.weight 0x30a8da1e0 0xdc0000
232 blk.25.attn_v.weight 0x30b69a1e0 0x41a000
233 blk.25.ffn_down.weight 0x30bab41e0 0xaa00000
234 blk.25.ffn_gate.weight 0x3164b41e0 0x6e00000
235 blk.25.ffn_norm.weight 0x31d2b41e0 0x5000
236 blk.25.ffn_up.weight 0x31d2b91e0 0x6e00000
237 blk.26.attn_k.weight 0x3240b91e0 0x370000
238 blk.26.attn_norm.weight 0x3244291e0 0x5000
239 blk.26.attn_output.weight 0x32442e1e0 0x1068000
240 blk.26.attn_q.weight 0x3254961e0 0xdc0000
241 blk.26.attn_v.weight 0x3262561e0 0x41a000
242 blk.26.ffn_down.weight 0x3266701e0 0xaa00000
243 blk.26.ffn_gate.weight 0x3310701e0 0x6e00000
244 blk.26.ffn_norm.weight 0x337e701e0 0x5000
245 blk.26.ffn_up.weight 0x337e751e0 0x6e00000
246 blk.27.attn_k.weight 0x33ec751e0 0x41a000
247 blk.27.attn_norm.weight 0x33f08f1e0 0x5000
248 blk.27.attn_output.weight 0x33f0941e0 0x1068000
249 blk.27.attn_q.weight 0x3400fc1e0 0x1068000
250 blk.27.attn_v.weight 0x3411641e0 0x550000
251 blk.27.ffn_down.weight 0x3416b41e0 0xaa00000
252 blk.27.ffn_gate.weight 0x34c0b41e0 0x6e00000
253 blk.27.ffn_norm.weight 0x352eb41e0 0x5000
254 blk.27.ffn_up.weight 0x352eb91e0 0x6e00000
255 blk.28.attn_k.weight 0x359cb91e0 0x370000
256 blk.28.attn_norm.weight 0x35a0291e0 0x5000
257 blk.28.attn_output.weight 0x35a02e1e0 0x1068000
258 blk.28.attn_q.weight 0x35b0961e0 0xdc0000
259 blk.28.attn_v.weight 0x35be561e0 0x41a000
260 blk.28.ffn_down.weight 0x35c2701e0 0xaa00000
261 blk.28.ffn_gate.weight 0x366c701e0 0x6e00000
262 blk.28.ffn_norm.weight 0x36da701e0 0x5000
263 blk.28.ffn_up.weight 0x36da751e0 0x6e00000
264 blk.29.attn_k.weight 0x3748751e0 0x370000
265 blk.29.attn_norm.weight 0x374be51e0 0x5000
266 blk.29.attn_output.weight 0x374bea1e0 0x1068000
267 blk.29.attn_q.weight 0x375c521e0 0xdc0000
268 blk.29.attn_v.weight 0x376a121e0 0x41a000
269 blk.29.ffn_down.weight 0x376e2c1e0 0xaa00000
270 blk.29.ffn_gate.weight 0x38182c1e0 0x6e00000
271 blk.29.ffn_norm.weight 0x38862c1e0 0x5000
272 blk.29.ffn_up.weight 0x3886311e0 0x6e00000
273 blk.30.attn_k.weight 0x38f4311e0 0x370000
274 blk.30.attn_norm.weight 0x38f7a11e0 0x5000
275 blk.30.attn_output.weight 0x38f7a61e0 0x1068000
276 blk.30.attn_q.weight 0x39080e1e0 0xdc0000
277 blk.30.attn_v.weight 0x3915ce1e0 0x41a000
278 blk.30.ffn_down.weight 0x3919e81e0 0xaa00000
279 blk.30.ffn_gate.weight 0x39c3e81e0 0x6e00000
280 blk.30.ffn_norm.weight 0x3a31e81e0 0x5000
281 blk.30.ffn_up.weight 0x3a31ed1e0 0x6e00000
282 blk.31.attn_k.weight 0x3a9fed1e0 0x370000
283 blk.31.attn_norm.weight 0x3aa35d1e0 0x5000
284 blk.31.attn_output.weight 0x3aa3621e0 0x1068000
285 blk.31.attn_q.weight 0x3ab3ca1e0 0xdc0000
286 blk.31.attn_v.weight 0x3ac18a1e0 0x41a000
287 blk.31.ffn_down.weight 0x3ac5a41e0 0xaa00000
288 blk.31.ffn_gate.weight 0x3b6fa41e0 0x6e00000
289 blk.31.ffn_norm.weight 0x3bdda41e0 0x5000
290 blk.31.ffn_up.weight 0x3bdda91e0 0x6e00000
291 blk.32.attn_k.weight 0x3c4ba91e0 0x370000
292 blk.32.attn_norm.weight 0x3c4f191e0 0x5000
293 blk.32.attn_output.weight 0x3c4f1e1e0 0x1068000
294 blk.32.attn_q.weight 0x3c5f861e0 0xdc0000
295 blk.32.attn_v.weight 0x3c6d461e0 0x41a000
296 blk.32.ffn_down.weight 0x3c71601e0 0xaa00000
297 blk.32.ffn_gate.weight 0x3d1b601e0 0x6e00000
298 blk.32.ffn_norm.weight 0x3d89601e0 0x5000
299 blk.32.ffn_up.weight 0x3d89651e0 0x6e00000
300 blk.33.attn_k.weight 0x3df7651e0 0x370000
301 blk.33.attn_norm.weight 0x3dfad51e0 0x5000
302 blk.33.attn_output.weight 0x3dfada1e0 0x1068000
303 blk.33.attn_q.weight 0x3e0b421e0 0xdc0000
304 blk.33.attn_v.weight 0x3e19021e0 0x41a000
305 blk.33.ffn_down.weight 0x3e1d1c1e0 0xaa00000
306 blk.33.ffn_gate.weight 0x3ec71c1e0 0x6e00000
307 blk.33.ffn_norm.weight 0x3f351c1e0 0x5000
308 blk.33.ffn_up.weight 0x3f35211e0 0x6e00000
309 blk.34.attn_k.weight 0x3fa3211e0 0x370000
310 blk.34.attn_norm.weight 0x3fa6911e0 0x5000
311 blk.34.attn_output.weight 0x3fa6961e0 0x1068000
312 blk.34.attn_q.weight 0x3fb6fe1e0 0xdc0000
313 blk.34.attn_v.weight 0x3fc4be1e0 0x41a000
314 blk.34.ffn_down.weight 0x3fc8d81e0 0xaa00000
315 blk.34.ffn_gate.weight 0x4072d81e0 0x6e00000
316 blk.34.ffn_norm.weight 0x40e0d81e0 0x5000
317 blk.34.ffn_up.weight 0x40e0dd1e0 0x6e00000
318 blk.35.attn_k.weight 0x414edd1e0 0x370000
319 blk.35.attn_norm.weight 0x41524d1e0 0x5000
320 blk.35.attn_output.weight 0x4152521e0 0x1068000
321 blk.35.attn_q.weight 0x4162ba1e0 0xdc0000
322 blk.35.attn_v.weight 0x41707a1e0 0x41a000
323 blk.35.ffn_down.weight 0x4174941e0 0xaa00000
324 blk.35.ffn_gate.weight 0x421e941e0 0x6e00000
325 blk.35.ffn_norm.weight 0x428c941e0 0x5000
326 blk.35.ffn_up.weight 0x428c991e0 0x6e00000
327 blk.36.attn_k.weight 0x42fa991e0 0x370000
328 blk.36.attn_norm.weight 0x42fe091e0 0x5000
329 blk.36.attn_output.weight 0x42fe0e1e0 0x1068000
330 blk.36.attn_q.weight 0x430e761e0 0xdc0000
331 blk.36.attn_v.weight 0x431c361e0 0x41a000
332 blk.36.ffn_down.weight 0x4320501e0 0xaa00000
333 blk.36.ffn_gate.weight 0x43ca501e0 0x6e00000
334 blk.36.ffn_norm.weight 0x4438501e0 0x5000
335 blk.36.ffn_up.weight 0x4438551e0 0x6e00000
336 blk.37.attn_k.weight 0x44a6551e0 0x370000
337 blk.37.attn_norm.weight 0x44a9c51e0 0x5000
338 blk.37.attn_output.weight 0x44a9ca1e0 0x1068000
339 blk.37.attn_q.weight 0x44ba321e0 0xdc0000
340 blk.37.attn_v.weight 0x44c7f21e0 0x41a000
341 blk.37.ffn_down.weight 0x44cc0c1e0 0xaa00000
342 blk.37.ffn_gate.weight 0x45760c1e0 0x6e00000
343 blk.37.ffn_norm.weight 0x45e40c1e0 0x5000
344 blk.37.ffn_up.weight 0x45e4111e0 0x6e00000
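
Each Data Size entry follows directly from the tensor's element count and its quantization format: llama.cpp stores Q8_0 as 34 bytes per 32-weight block (8.5 BPW), Q6_K as 210 bytes per 256-weight super-block (6.5625 BPW), Q5_K as 176 bytes (5.5 BPW) and Q3_K as 110 bytes (3.4375 BPW) per 256 weights. A small sanity check against the rows above, using element counts taken from the group tables further down:

```python
# (weights per block, bytes per block) for the formats used in this file
BLOCK = {
    "F32":  (1, 4),
    "Q8_0": (32, 34),
    "Q6_K": (256, 210),
    "Q5_K": (256, 176),
    "Q3_K": (256, 110),
}

def tensor_bytes(n_elements: int, qtype: str) -> int:
    weights, nbytes = BLOCK[qtype]
    assert n_elements % weights == 0
    return n_elements // weights * nbytes

print(hex(tensor_bytes(5242880, "Q6_K")))  # blk.0.attn_k.weight   -> 0x41a000
print(hex(tensor_bytes(5242880, "Q8_0")))  # blk.0.attn_v.weight   -> 0x550000
print(hex(tensor_bytes(5120, "F32")))      # blk.0.attn_norm.weight -> 0x5000
```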

Base Tensor Group : ~1B Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
0 output.weight Output (W) (~671M) 671088640 5120 x 131072 x 1 x 1 Q6_K 6.5625
1 output_norm.weight Output Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
2 token_embd.weight Token Embedding (W) (~671M) 671088640 5120 x 131072 x 1 x 1 Q3_K 3.4375
  • Total elements in base: ( ~1B) 1342182400
  • Percentage of total elements: 5.98%
  • Bits per Weight (BPW) for base: 5.0001 bits
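
The group BPW figure is simply the element-weighted average of the per-tensor BPW values above:

```python
# base group: output (Q6_K), output_norm (F32), token_embd (Q3_K)
tensors = [(671088640, 6.5625), (5120, 32.0), (671088640, 3.4375)]
bpw = sum(n * b for n, b in tensors) / sum(n for n, _ in tensors)
print(round(bpw, 4))  # 5.0001
```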

Block 0 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
3 blk.0.attn_k.weight Block 0 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
4 blk.0.attn_norm.weight Block 0 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
5 blk.0.attn_output.weight Block 0 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
6 blk.0.attn_q.weight Block 0 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
7 blk.0.attn_v.weight Block 0 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
8 blk.0.ffn_down.weight Block 0 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
9 blk.0.ffn_gate.weight Block 0 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
10 blk.0.ffn_norm.weight Block 0 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
11 blk.0.ffn_up.weight Block 0 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.0: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.0: 7.1661 bits
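
The same element-weighted average reproduces every block-level BPW in this dump; for blk.0 (and the other all-Q6_K/Q8_0 blocks reporting 7.1661 bits):

```python
# blk.0: the nine tensors listed above as (elements, BPW)
tensors = [
    (5242880, 6.5625),    # attn_k,      Q6_K
    (5120, 32.0),         # attn_norm,   F32
    (20971520, 6.5625),   # attn_output, Q6_K
    (20971520, 6.5625),   # attn_q,      Q6_K
    (5242880, 8.5),       # attn_v,      Q8_0
    (167772160, 8.5),     # ffn_down,    Q8_0
    (167772160, 6.5625),  # ffn_gate,    Q6_K
    (5120, 32.0),         # ffn_norm,    F32
    (167772160, 6.5625),  # ffn_up,      Q6_K
]
bpw = sum(n * b for n, b in tensors) / sum(n for n, _ in tensors)
print(round(bpw, 4))  # 7.1661
```

The lower figures later in the file (7.0977, 6.5246, 6.4562) fall out of the same average once some attention and FFN tensors drop to Q5_K.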

Block 1 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
12 blk.1.attn_k.weight Block 1 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
13 blk.1.attn_norm.weight Block 1 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
14 blk.1.attn_output.weight Block 1 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
15 blk.1.attn_q.weight Block 1 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
16 blk.1.attn_v.weight Block 1 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
17 blk.1.ffn_down.weight Block 1 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
18 blk.1.ffn_gate.weight Block 1 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
19 blk.1.ffn_norm.weight Block 1 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
20 blk.1.ffn_up.weight Block 1 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.1: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.1: 7.1661 bits

Block 2 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
21 blk.2.attn_k.weight Block 2 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
22 blk.2.attn_norm.weight Block 2 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
23 blk.2.attn_output.weight Block 2 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
24 blk.2.attn_q.weight Block 2 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
25 blk.2.attn_v.weight Block 2 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
26 blk.2.ffn_down.weight Block 2 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
27 blk.2.ffn_gate.weight Block 2 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
28 blk.2.ffn_norm.weight Block 2 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
29 blk.2.ffn_up.weight Block 2 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.2: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.2: 7.1661 bits

Block 3 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
30 blk.3.attn_k.weight Block 3 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
31 blk.3.attn_norm.weight Block 3 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
32 blk.3.attn_output.weight Block 3 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
33 blk.3.attn_q.weight Block 3 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
34 blk.3.attn_v.weight Block 3 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
35 blk.3.ffn_down.weight Block 3 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
36 blk.3.ffn_gate.weight Block 3 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
37 blk.3.ffn_norm.weight Block 3 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
38 blk.3.ffn_up.weight Block 3 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.3: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.3: 7.1661 bits

Block 4 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
39 blk.4.attn_k.weight Block 4 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
40 blk.4.attn_norm.weight Block 4 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
41 blk.4.attn_output.weight Block 4 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
42 blk.4.attn_q.weight Block 4 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
43 blk.4.attn_v.weight Block 4 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
44 blk.4.ffn_down.weight Block 4 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
45 blk.4.ffn_gate.weight Block 4 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
46 blk.4.ffn_norm.weight Block 4 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
47 blk.4.ffn_up.weight Block 4 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.4: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.4: 7.1661 bits

Block 5 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
48 blk.5.attn_k.weight Block 5 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
49 blk.5.attn_norm.weight Block 5 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
50 blk.5.attn_output.weight Block 5 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
51 blk.5.attn_q.weight Block 5 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
52 blk.5.attn_v.weight Block 5 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
53 blk.5.ffn_down.weight Block 5 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
54 blk.5.ffn_gate.weight Block 5 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
55 blk.5.ffn_norm.weight Block 5 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
56 blk.5.ffn_up.weight Block 5 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.5: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.5: 7.1661 bits

Block 6 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
57 blk.6.attn_k.weight Block 6 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
58 blk.6.attn_norm.weight Block 6 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
59 blk.6.attn_output.weight Block 6 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
60 blk.6.attn_q.weight Block 6 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
61 blk.6.attn_v.weight Block 6 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
62 blk.6.ffn_down.weight Block 6 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
63 blk.6.ffn_gate.weight Block 6 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
64 blk.6.ffn_norm.weight Block 6 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
65 blk.6.ffn_up.weight Block 6 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.6: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.6: 7.1661 bits

Block 7 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
66 blk.7.attn_k.weight Block 7 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
67 blk.7.attn_norm.weight Block 7 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
68 blk.7.attn_output.weight Block 7 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
69 blk.7.attn_q.weight Block 7 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
70 blk.7.attn_v.weight Block 7 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
71 blk.7.ffn_down.weight Block 7 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
72 blk.7.ffn_gate.weight Block 7 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
73 blk.7.ffn_norm.weight Block 7 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
74 blk.7.ffn_up.weight Block 7 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.7: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.7: 7.1661 bits

Block 8 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
75 blk.8.attn_k.weight Block 8 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
76 blk.8.attn_norm.weight Block 8 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
77 blk.8.attn_output.weight Block 8 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
78 blk.8.attn_q.weight Block 8 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
79 blk.8.attn_v.weight Block 8 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
80 blk.8.ffn_down.weight Block 8 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
81 blk.8.ffn_gate.weight Block 8 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
82 blk.8.ffn_norm.weight Block 8 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
83 blk.8.ffn_up.weight Block 8 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.8: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.8: 7.1661 bits

Block 9 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
84 blk.9.attn_k.weight Block 9 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
85 blk.9.attn_norm.weight Block 9 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
86 blk.9.attn_output.weight Block 9 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
87 blk.9.attn_q.weight Block 9 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
88 blk.9.attn_v.weight Block 9 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
89 blk.9.ffn_down.weight Block 9 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
90 blk.9.ffn_gate.weight Block 9 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
91 blk.9.ffn_norm.weight Block 9 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
92 blk.9.ffn_up.weight Block 9 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.9: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.9: 7.1661 bits

Block 10 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
93 blk.10.attn_k.weight Block 10 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
94 blk.10.attn_norm.weight Block 10 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
95 blk.10.attn_output.weight Block 10 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
96 blk.10.attn_q.weight Block 10 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
97 blk.10.attn_v.weight Block 10 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
98 blk.10.ffn_down.weight Block 10 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
99 blk.10.ffn_gate.weight Block 10 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
100 blk.10.ffn_norm.weight Block 10 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
101 blk.10.ffn_up.weight Block 10 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.10: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.10: 7.1661 bits

Block 11 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
102 blk.11.attn_k.weight Block 11 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
103 blk.11.attn_norm.weight Block 11 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
104 blk.11.attn_output.weight Block 11 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
105 blk.11.attn_q.weight Block 11 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
106 blk.11.attn_v.weight Block 11 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
107 blk.11.ffn_down.weight Block 11 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
108 blk.11.ffn_gate.weight Block 11 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
109 blk.11.ffn_norm.weight Block 11 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
110 blk.11.ffn_up.weight Block 11 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.11: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.11: 7.1661 bits

Block 12 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
111 blk.12.attn_k.weight Block 12 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
112 blk.12.attn_norm.weight Block 12 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
113 blk.12.attn_output.weight Block 12 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
114 blk.12.attn_q.weight Block 12 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
115 blk.12.attn_v.weight Block 12 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
116 blk.12.ffn_down.weight Block 12 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
117 blk.12.ffn_gate.weight Block 12 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
118 blk.12.ffn_norm.weight Block 12 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
119 blk.12.ffn_up.weight Block 12 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.12: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.12: 7.1661 bits

Block 13 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
120 blk.13.attn_k.weight Block 13 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
121 blk.13.attn_norm.weight Block 13 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
122 blk.13.attn_output.weight Block 13 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
123 blk.13.attn_q.weight Block 13 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
124 blk.13.attn_v.weight Block 13 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
125 blk.13.ffn_down.weight Block 13 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
126 blk.13.ffn_gate.weight Block 13 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
127 blk.13.ffn_norm.weight Block 13 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
128 blk.13.ffn_up.weight Block 13 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.13: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.13: 7.1661 bits

Block 14 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
129 blk.14.attn_k.weight Block 14 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
130 blk.14.attn_norm.weight Block 14 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
131 blk.14.attn_output.weight Block 14 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
132 blk.14.attn_q.weight Block 14 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
133 blk.14.attn_v.weight Block 14 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
134 blk.14.ffn_down.weight Block 14 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
135 blk.14.ffn_gate.weight Block 14 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
136 blk.14.ffn_norm.weight Block 14 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
137 blk.14.ffn_up.weight Block 14 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.14: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.14: 7.1661 bits

Block 15 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
138 blk.15.attn_k.weight Block 15 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
139 blk.15.attn_norm.weight Block 15 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
140 blk.15.attn_output.weight Block 15 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
141 blk.15.attn_q.weight Block 15 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
142 blk.15.attn_v.weight Block 15 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
143 blk.15.ffn_down.weight Block 15 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
144 blk.15.ffn_gate.weight Block 15 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
145 blk.15.ffn_norm.weight Block 15 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
146 blk.15.ffn_up.weight Block 15 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.15: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.15: 7.1661 bits

Block 16 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
147 blk.16.attn_k.weight Block 16 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
148 blk.16.attn_norm.weight Block 16 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
149 blk.16.attn_output.weight Block 16 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
150 blk.16.attn_q.weight Block 16 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
151 blk.16.attn_v.weight Block 16 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
152 blk.16.ffn_down.weight Block 16 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
153 blk.16.ffn_gate.weight Block 16 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
154 blk.16.ffn_norm.weight Block 16 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
155 blk.16.ffn_up.weight Block 16 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.16: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.16: 7.1661 bits

Block 17 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
156 blk.17.attn_k.weight Block 17 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
157 blk.17.attn_norm.weight Block 17 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
158 blk.17.attn_output.weight Block 17 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
159 blk.17.attn_q.weight Block 17 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
160 blk.17.attn_v.weight Block 17 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
161 blk.17.ffn_down.weight Block 17 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
162 blk.17.ffn_gate.weight Block 17 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
163 blk.17.ffn_norm.weight Block 17 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
164 blk.17.ffn_up.weight Block 17 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.17: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.17: 7.0977 bits

Block 18 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
165 blk.18.attn_k.weight Block 18 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
166 blk.18.attn_norm.weight Block 18 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
167 blk.18.attn_output.weight Block 18 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
168 blk.18.attn_q.weight Block 18 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
169 blk.18.attn_v.weight Block 18 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
170 blk.18.ffn_down.weight Block 18 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
171 blk.18.ffn_gate.weight Block 18 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
172 blk.18.ffn_norm.weight Block 18 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
173 blk.18.ffn_up.weight Block 18 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.18: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.18: 7.0977 bits

Block 19 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
174 blk.19.attn_k.weight Block 19 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
175 blk.19.attn_norm.weight Block 19 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
176 blk.19.attn_output.weight Block 19 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
177 blk.19.attn_q.weight Block 19 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
178 blk.19.attn_v.weight Block 19 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
179 blk.19.ffn_down.weight Block 19 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
180 blk.19.ffn_gate.weight Block 19 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
181 blk.19.ffn_norm.weight Block 19 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
182 blk.19.ffn_up.weight Block 19 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q6_K 6.5625
  • Total elements in blk.19: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.19: 7.1661 bits

Block 20 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
183 blk.20.attn_k.weight Block 20 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
184 blk.20.attn_norm.weight Block 20 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
185 blk.20.attn_output.weight Block 20 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
186 blk.20.attn_q.weight Block 20 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
187 blk.20.attn_v.weight Block 20 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
188 blk.20.ffn_down.weight Block 20 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
189 blk.20.ffn_gate.weight Block 20 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
190 blk.20.ffn_norm.weight Block 20 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
191 blk.20.ffn_up.weight Block 20 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.20: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.20: 6.4562 bits

Block 21 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
192 blk.21.attn_k.weight Block 21 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
193 blk.21.attn_norm.weight Block 21 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
194 blk.21.attn_output.weight Block 21 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
195 blk.21.attn_q.weight Block 21 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
196 blk.21.attn_v.weight Block 21 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
197 blk.21.ffn_down.weight Block 21 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
198 blk.21.ffn_gate.weight Block 21 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
199 blk.21.ffn_norm.weight Block 21 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
200 blk.21.ffn_up.weight Block 21 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.21: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.21: 6.5246 bits

Block 22 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
201 blk.22.attn_k.weight Block 22 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
202 blk.22.attn_norm.weight Block 22 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
203 blk.22.attn_output.weight Block 22 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
204 blk.22.attn_q.weight Block 22 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
205 blk.22.attn_v.weight Block 22 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
206 blk.22.ffn_down.weight Block 22 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
207 blk.22.ffn_gate.weight Block 22 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
208 blk.22.ffn_norm.weight Block 22 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
209 blk.22.ffn_up.weight Block 22 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.22: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.22: 6.4562 bits

Block 23 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
210 blk.23.attn_k.weight Block 23 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
211 blk.23.attn_norm.weight Block 23 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
212 blk.23.attn_output.weight Block 23 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
213 blk.23.attn_q.weight Block 23 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
214 blk.23.attn_v.weight Block 23 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
215 blk.23.ffn_down.weight Block 23 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
216 blk.23.ffn_gate.weight Block 23 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
217 blk.23.ffn_norm.weight Block 23 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
218 blk.23.ffn_up.weight Block 23 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.23: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.23: 6.4562 bits

Block 24 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
219 blk.24.attn_k.weight Block 24 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
220 blk.24.attn_norm.weight Block 24 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
221 blk.24.attn_output.weight Block 24 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
222 blk.24.attn_q.weight Block 24 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
223 blk.24.attn_v.weight Block 24 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
224 blk.24.ffn_down.weight Block 24 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
225 blk.24.ffn_gate.weight Block 24 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
226 blk.24.ffn_norm.weight Block 24 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
227 blk.24.ffn_up.weight Block 24 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.24: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.24: 6.4562 bits

Block 25 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
228 blk.25.attn_k.weight Block 25 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
229 blk.25.attn_norm.weight Block 25 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
230 blk.25.attn_output.weight Block 25 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
231 blk.25.attn_q.weight Block 25 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
232 blk.25.attn_v.weight Block 25 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
233 blk.25.ffn_down.weight Block 25 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
234 blk.25.ffn_gate.weight Block 25 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
235 blk.25.ffn_norm.weight Block 25 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
236 blk.25.ffn_up.weight Block 25 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.25: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.25: 6.4562 bits

Block 26 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
237 blk.26.attn_k.weight Block 26 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
238 blk.26.attn_norm.weight Block 26 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
239 blk.26.attn_output.weight Block 26 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
240 blk.26.attn_q.weight Block 26 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
241 blk.26.attn_v.weight Block 26 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
242 blk.26.ffn_down.weight Block 26 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
243 blk.26.ffn_gate.weight Block 26 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
244 blk.26.ffn_norm.weight Block 26 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
245 blk.26.ffn_up.weight Block 26 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.26: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.26: 6.4562 bits

Block 27 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
246 blk.27.attn_k.weight Block 27 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
247 blk.27.attn_norm.weight Block 27 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
248 blk.27.attn_output.weight Block 27 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
249 blk.27.attn_q.weight Block 27 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q6_K 6.5625
250 blk.27.attn_v.weight Block 27 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q8_0 8.5000
251 blk.27.ffn_down.weight Block 27 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
252 blk.27.ffn_gate.weight Block 27 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
253 blk.27.ffn_norm.weight Block 27 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
254 blk.27.ffn_up.weight Block 27 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.27: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.27: 6.5246 bits

Block 28 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
255 blk.28.attn_k.weight Block 28 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
256 blk.28.attn_norm.weight Block 28 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
257 blk.28.attn_output.weight Block 28 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
258 blk.28.attn_q.weight Block 28 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
259 blk.28.attn_v.weight Block 28 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
260 blk.28.ffn_down.weight Block 28 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
261 blk.28.ffn_gate.weight Block 28 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
262 blk.28.ffn_norm.weight Block 28 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
263 blk.28.ffn_up.weight Block 28 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.28: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.28: 6.4562 bits

Block 29 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
264 blk.29.attn_k.weight Block 29 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
265 blk.29.attn_norm.weight Block 29 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
266 blk.29.attn_output.weight Block 29 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q6_K 6.5625
267 blk.29.attn_q.weight Block 29 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
268 blk.29.attn_v.weight Block 29 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q6_K 6.5625
269 blk.29.ffn_down.weight Block 29 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q8_0 8.5000
270 blk.29.ffn_gate.weight Block 29 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
271 blk.29.ffn_norm.weight Block 29 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
272 blk.29.ffn_up.weight Block 29 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.29: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.29: 6.4562 bits

Block 30 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type | BPW |
|-----:|:------------------|:---------------------------------|---------:|:------|:----:|----:|
| 273 | blk.30.attn_k.weight | Block 30 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q5_K | 5.5000 |
| 274 | blk.30.attn_norm.weight | Block 30 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 275 | blk.30.attn_output.weight | Block 30 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q6_K | 6.5625 |
| 276 | blk.30.attn_q.weight | Block 30 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q5_K | 5.5000 |
| 277 | blk.30.attn_v.weight | Block 30 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K | 6.5625 |
| 278 | blk.30.ffn_down.weight | Block 30 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 | 8.5000 |
| 279 | blk.30.ffn_gate.weight | Block 30 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q5_K | 5.5000 |
| 280 | blk.30.ffn_norm.weight | Block 30 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 281 | blk.30.ffn_up.weight | Block 30 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q5_K | 5.5000 |
  • Total elements in blk.30: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.30: 6.4562 bits

Block 31 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type | BPW |
|-----:|:------------------|:---------------------------------|---------:|:------|:----:|----:|
| 282 | blk.31.attn_k.weight | Block 31 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q5_K | 5.5000 |
| 283 | blk.31.attn_norm.weight | Block 31 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 284 | blk.31.attn_output.weight | Block 31 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q6_K | 6.5625 |
| 285 | blk.31.attn_q.weight | Block 31 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q5_K | 5.5000 |
| 286 | blk.31.attn_v.weight | Block 31 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K | 6.5625 |
| 287 | blk.31.ffn_down.weight | Block 31 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 | 8.5000 |
| 288 | blk.31.ffn_gate.weight | Block 31 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q5_K | 5.5000 |
| 289 | blk.31.ffn_norm.weight | Block 31 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 290 | blk.31.ffn_up.weight | Block 31 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q5_K | 5.5000 |
  • Total elements in blk.31: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.31: 6.4562 bits

Block 32 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type | BPW |
|-----:|:------------------|:---------------------------------|---------:|:------|:----:|----:|
| 291 | blk.32.attn_k.weight | Block 32 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q5_K | 5.5000 |
| 292 | blk.32.attn_norm.weight | Block 32 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 293 | blk.32.attn_output.weight | Block 32 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q6_K | 6.5625 |
| 294 | blk.32.attn_q.weight | Block 32 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q5_K | 5.5000 |
| 295 | blk.32.attn_v.weight | Block 32 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K | 6.5625 |
| 296 | blk.32.ffn_down.weight | Block 32 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 | 8.5000 |
| 297 | blk.32.ffn_gate.weight | Block 32 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q5_K | 5.5000 |
| 298 | blk.32.ffn_norm.weight | Block 32 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 299 | blk.32.ffn_up.weight | Block 32 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q5_K | 5.5000 |
  • Total elements in blk.32: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.32: 6.4562 bits

Block 33 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type | BPW |
|-----:|:------------------|:---------------------------------|---------:|:------|:----:|----:|
| 300 | blk.33.attn_k.weight | Block 33 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q5_K | 5.5000 |
| 301 | blk.33.attn_norm.weight | Block 33 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 302 | blk.33.attn_output.weight | Block 33 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q6_K | 6.5625 |
| 303 | blk.33.attn_q.weight | Block 33 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q5_K | 5.5000 |
| 304 | blk.33.attn_v.weight | Block 33 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K | 6.5625 |
| 305 | blk.33.ffn_down.weight | Block 33 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 | 8.5000 |
| 306 | blk.33.ffn_gate.weight | Block 33 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q5_K | 5.5000 |
| 307 | blk.33.ffn_norm.weight | Block 33 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 308 | blk.33.ffn_up.weight | Block 33 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q5_K | 5.5000 |
  • Total elements in blk.33: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.33: 6.4562 bits

Block 34 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type | BPW |
|-----:|:------------------|:---------------------------------|---------:|:------|:----:|----:|
| 309 | blk.34.attn_k.weight | Block 34 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q5_K | 5.5000 |
| 310 | blk.34.attn_norm.weight | Block 34 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 311 | blk.34.attn_output.weight | Block 34 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q6_K | 6.5625 |
| 312 | blk.34.attn_q.weight | Block 34 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q5_K | 5.5000 |
| 313 | blk.34.attn_v.weight | Block 34 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K | 6.5625 |
| 314 | blk.34.ffn_down.weight | Block 34 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 | 8.5000 |
| 315 | blk.34.ffn_gate.weight | Block 34 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q5_K | 5.5000 |
| 316 | blk.34.ffn_norm.weight | Block 34 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 317 | blk.34.ffn_up.weight | Block 34 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q5_K | 5.5000 |
  • Total elements in blk.34: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.34: 6.4562 bits

Block 35 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type | BPW |
|-----:|:------------------|:---------------------------------|---------:|:------|:----:|----:|
| 318 | blk.35.attn_k.weight | Block 35 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q5_K | 5.5000 |
| 319 | blk.35.attn_norm.weight | Block 35 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 320 | blk.35.attn_output.weight | Block 35 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q6_K | 6.5625 |
| 321 | blk.35.attn_q.weight | Block 35 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q5_K | 5.5000 |
| 322 | blk.35.attn_v.weight | Block 35 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K | 6.5625 |
| 323 | blk.35.ffn_down.weight | Block 35 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 | 8.5000 |
| 324 | blk.35.ffn_gate.weight | Block 35 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q5_K | 5.5000 |
| 325 | blk.35.ffn_norm.weight | Block 35 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 326 | blk.35.ffn_up.weight | Block 35 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q5_K | 5.5000 |
  • Total elements in blk.35: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.35: 6.4562 bits

Block 36 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type | BPW |
|-----:|:------------------|:---------------------------------|---------:|:------|:----:|----:|
| 327 | blk.36.attn_k.weight | Block 36 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q5_K | 5.5000 |
| 328 | blk.36.attn_norm.weight | Block 36 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 329 | blk.36.attn_output.weight | Block 36 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q6_K | 6.5625 |
| 330 | blk.36.attn_q.weight | Block 36 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q5_K | 5.5000 |
| 331 | blk.36.attn_v.weight | Block 36 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K | 6.5625 |
| 332 | blk.36.ffn_down.weight | Block 36 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 | 8.5000 |
| 333 | blk.36.ffn_gate.weight | Block 36 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q5_K | 5.5000 |
| 334 | blk.36.ffn_norm.weight | Block 36 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 335 | blk.36.ffn_up.weight | Block 36 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q5_K | 5.5000 |
  • Total elements in blk.36: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.36: 6.4562 bits

Block 37 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type | BPW |
|-----:|:------------------|:---------------------------------|---------:|:------|:----:|----:|
| 336 | blk.37.attn_k.weight | Block 37 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q5_K | 5.5000 |
| 337 | blk.37.attn_norm.weight | Block 37 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 338 | blk.37.attn_output.weight | Block 37 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q6_K | 6.5625 |
| 339 | blk.37.attn_q.weight | Block 37 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q5_K | 5.5000 |
| 340 | blk.37.attn_v.weight | Block 37 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q6_K | 6.5625 |
| 341 | blk.37.ffn_down.weight | Block 37 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q8_0 | 8.5000 |
| 342 | blk.37.ffn_gate.weight | Block 37 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q5_K | 5.5000 |
| 343 | blk.37.ffn_norm.weight | Block 37 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 344 | blk.37.ffn_up.weight | Block 37 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q5_K | 5.5000 |
  • Total elements in blk.37: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.37: 6.4562 bits

Total BPW for Mistral-Small-3.2-24B-Instruct-pruned-Q6_K.gguf: 6.7205 bits
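
The file-level figure also covers the tensors outside the per-block groups (output.weight, output_norm.weight, token_embd.weight), which is why it differs from the per-block averages. It should be reproducible by dividing the total on-disk tensor bits by the total element count; a minimal sketch using the gguf Python package that ships with llama.cpp (assuming the current gguf-py GGUFReader API, where each ReaderTensor exposes n_bytes and n_elements):

```python
# Recompute the file-level BPW by walking every tensor in the GGUF.
# Requires llama.cpp's gguf package:  pip install gguf
from gguf import GGUFReader

reader = GGUFReader("Mistral-Small-3.2-24B-Instruct-pruned-Q6_K.gguf")

total_bits  = sum(t.n_bytes for t in reader.tensors) * 8
total_elems = sum(int(t.n_elements) for t in reader.tensors)

print(f"tensor count  : {len(reader.tensors)}")            # expected 345
print(f"total elements: {total_elems}")                    # expected 22460892160
print(f"file-level BPW: {total_bits / total_elems:.4f}")   # expected 6.7205
```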