Mistral-Small-3.2-24B-Instruct-pruned-Q5_K_S.gguf - GGUF Internal File Dump

  • Endian: little-endian

Key-Value Metadata Store

There are 49 key-value entries in this file. The first three (GGUF.version, GGUF.tensor_count and GGUF.kv_count) are fixed header fields rather than stored metadata; the remaining 46 are the pairs counted by GGUF.kv_count.

POS TYPE Count Key Value
1 UINT32 1 GGUF.version 3
2 UINT64 1 GGUF.tensor_count 345
3 UINT64 1 GGUF.kv_count 46
4 STRING 1 general.architecture llama
5 STRING 1 general.type model
6 STRING 1 general.name Mistral Small 3.2 24B Instruct 2506
7 STRING 1 general.version 2506
8 STRING 1 general.finetune Instruct
9 STRING 1 general.basename Mistral-Small-3.2
10 STRING 1 general.size_label 24B
11 STRING 1 general.license apache-2.0
12 UINT32 1 general.base_model.count 1
13 STRING 1 general.base_model.0.name Mistral Small 3.1 24B Base 2503
14 STRING 1 general.base_model.0.version 2503
15 STRING 1 general.base_model.0.organization Mistralai
16 STRING 1 general.base_model.0.repo_url https://huggingface.co/mistral...istral-Small-3.1-24B-Base-2503
17 [STRING] 1 general.tags [ image-text-to-text ]
18 [STRING] 24 general.languages [ en, fr, de, es, pt, ... ]
19 UINT32 1 llama.context_length 131072
20 UINT32 1 llama.embedding_length 5120
21 UINT32 1 llama.feed_forward_length 32768
22 UINT32 1 llama.attention.head_count 32
23 UINT32 1 llama.attention.head_count_kv 8
24 FLOAT32 1 llama.rope.freq_base 1000000000.0
25 FLOAT32 1 llama.attention.layer_norm_rms_epsilon 1e-05
26 UINT32 1 llama.attention.key_length 128
27 UINT32 1 llama.attention.value_length 128
28 UINT32 1 llama.vocab_size 131072
29 UINT32 1 llama.rope.dimension_count 128
30 STRING 1 tokenizer.ggml.model gpt2
31 STRING 1 tokenizer.ggml.pre tekken
32 [STRING] 131072 tokenizer.ggml.tokens [ <unk>, <s>, </s>, [INST], [/INST], ... ]
33 [INT32] 131072 tokenizer.ggml.token_type [ 3, 3, 3, 3, 3, 3, 3, ... ]
34 [STRING] 269443 tokenizer.ggml.merges [ Ġ Ġ, Ġ t, e r, i n, Ġ ĠĠĠ, ... ]
35 UINT32 1 tokenizer.ggml.bos_token_id 1
36 UINT32 1 tokenizer.ggml.eos_token_id 2
37 UINT32 1 tokenizer.ggml.unknown_token_id 0
38 UINT32 1 tokenizer.ggml.padding_token_id 11
39 BOOL 1 tokenizer.ggml.add_bos_token True
40 BOOL 1 tokenizer.ggml.add_sep_token False
41 BOOL 1 tokenizer.ggml.add_eos_token False
42 BOOL 1 tokenizer.ggml.add_space_prefix False
43 UINT32 1 llama.block_count 38
44 UINT32 1 general.quantization_version 2
45 UINT32 1 general.file_type 16
46 STRING 1 quantize.imatrix.file ./imatrix/imatrix-Mistral-Smal...24B-Instruct-pruned-medium.dat
47 STRING 1 quantize.imatrix.dataset ../../datasets/imatrix/text_eur_medium.txt
48 UINT32 1 quantize.imatrix.entries_count 266
49 UINT32 1 quantize.imatrix.chunks_count 1778
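
Since the first three entries live in the fixed GGUF header rather than the metadata store, they can be read with a couple of little-endian unpacks. A minimal sketch, assuming a local copy of the file (the path is illustrative):

```python
import struct

# GGUF v3 header layout: magic (u32), version (u32), tensor_count (u64), kv_count (u64)
with open("Mistral-Small-3.2-24B-Instruct-pruned-Q5_K_S.gguf", "rb") as f:
    magic = f.read(4)  # b"GGUF"
    version, tensor_count, kv_count = struct.unpack("<IQQ", f.read(20))

print(version, tensor_count, kv_count)  # 3 345 46
```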

Tensors Overview ~22B Elements

Total number of elements in all tensors: 22460892160
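
The per-tensor element counts in the group tables below sum to this figure. One way to reproduce it, assuming the gguf-py reader library that ships with llama.cpp:

```python
from gguf import GGUFReader

# The local file path is illustrative.
reader = GGUFReader("Mistral-Small-3.2-24B-Instruct-pruned-Q5_K_S.gguf")
print(sum(int(t.n_elements) for t in reader.tensors))  # 22460892160
```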

Tensor Data Offset

This table lists each tensor's data offset and data size, in bytes, relative to the start of the file.

T_ID Tensor Layer Name Data Offset (B) Data Size (B)
0 output.weight 0x7841e0 0x1b800000
1 output_norm.weight 0x1bf841e0 0x5000
2 token_embd.weight 0x1bf891e0 0x11300000
3 blk.0.attn_k.weight 0x2d2891e0 0x370000
4 blk.0.attn_norm.weight 0x2d5f91e0 0x5000
5 blk.0.attn_output.weight 0x2d5fe1e0 0xdc0000
6 blk.0.attn_q.weight 0x2e3be1e0 0xdc0000
7 blk.0.attn_v.weight 0x2f17e1e0 0x370000
8 blk.0.ffn_down.weight 0x2f4ee1e0 0x8340000
9 blk.0.ffn_gate.weight 0x3782e1e0 0x6e00000
10 blk.0.ffn_norm.weight 0x3e62e1e0 0x5000
11 blk.0.ffn_up.weight 0x3e6331e0 0x6e00000
12 blk.1.attn_k.weight 0x454331e0 0x370000
13 blk.1.attn_norm.weight 0x457a31e0 0x5000
14 blk.1.attn_output.weight 0x457a81e0 0xdc0000
15 blk.1.attn_q.weight 0x465681e0 0xdc0000
16 blk.1.attn_v.weight 0x473281e0 0x370000
17 blk.1.ffn_down.weight 0x476981e0 0x8340000
18 blk.1.ffn_gate.weight 0x4f9d81e0 0x6e00000
19 blk.1.ffn_norm.weight 0x567d81e0 0x5000
20 blk.1.ffn_up.weight 0x567dd1e0 0x6e00000
21 blk.2.attn_k.weight 0x5d5dd1e0 0x370000
22 blk.2.attn_norm.weight 0x5d94d1e0 0x5000
23 blk.2.attn_output.weight 0x5d9521e0 0xdc0000
24 blk.2.attn_q.weight 0x5e7121e0 0xdc0000
25 blk.2.attn_v.weight 0x5f4d21e0 0x370000
26 blk.2.ffn_down.weight 0x5f8421e0 0x8340000
27 blk.2.ffn_gate.weight 0x67b821e0 0x6e00000
28 blk.2.ffn_norm.weight 0x6e9821e0 0x5000
29 blk.2.ffn_up.weight 0x6e9871e0 0x6e00000
30 blk.3.attn_k.weight 0x757871e0 0x2d0000
31 blk.3.attn_norm.weight 0x75a571e0 0x5000
32 blk.3.attn_output.weight 0x75a5c1e0 0xdc0000
33 blk.3.attn_q.weight 0x7681c1e0 0xb40000
34 blk.3.attn_v.weight 0x7735c1e0 0x370000
35 blk.3.ffn_down.weight 0x776cc1e0 0x8340000
36 blk.3.ffn_gate.weight 0x7fa0c1e0 0x6e00000
37 blk.3.ffn_norm.weight 0x8680c1e0 0x5000
38 blk.3.ffn_up.weight 0x868111e0 0x6e00000
39 blk.4.attn_k.weight 0x8d6111e0 0x370000
40 blk.4.attn_norm.weight 0x8d9811e0 0x5000
41 blk.4.attn_output.weight 0x8d9861e0 0xdc0000
42 blk.4.attn_q.weight 0x8e7461e0 0xdc0000
43 blk.4.attn_v.weight 0x8f5061e0 0x370000
44 blk.4.ffn_down.weight 0x8f8761e0 0x8340000
45 blk.4.ffn_gate.weight 0x97bb61e0 0x6e00000
46 blk.4.ffn_norm.weight 0x9e9b61e0 0x5000
47 blk.4.ffn_up.weight 0x9e9bb1e0 0x6e00000
48 blk.5.attn_k.weight 0xa57bb1e0 0x370000
49 blk.5.attn_norm.weight 0xa5b2b1e0 0x5000
50 blk.5.attn_output.weight 0xa5b301e0 0xdc0000
51 blk.5.attn_q.weight 0xa68f01e0 0xdc0000
52 blk.5.attn_v.weight 0xa76b01e0 0x370000
53 blk.5.ffn_down.weight 0xa7a201e0 0x8340000
54 blk.5.ffn_gate.weight 0xafd601e0 0x6e00000
55 blk.5.ffn_norm.weight 0xb6b601e0 0x5000
56 blk.5.ffn_up.weight 0xb6b651e0 0x6e00000
57 blk.6.attn_k.weight 0xbd9651e0 0x370000
58 blk.6.attn_norm.weight 0xbdcd51e0 0x5000
59 blk.6.attn_output.weight 0xbdcda1e0 0xdc0000
60 blk.6.attn_q.weight 0xbea9a1e0 0xdc0000
61 blk.6.attn_v.weight 0xbf85a1e0 0x370000
62 blk.6.ffn_down.weight 0xbfbca1e0 0x8340000
63 blk.6.ffn_gate.weight 0xc7f0a1e0 0x6e00000
64 blk.6.ffn_norm.weight 0xced0a1e0 0x5000
65 blk.6.ffn_up.weight 0xced0f1e0 0x6e00000
66 blk.7.attn_k.weight 0xd5b0f1e0 0x370000
67 blk.7.attn_norm.weight 0xd5e7f1e0 0x5000
68 blk.7.attn_output.weight 0xd5e841e0 0xdc0000
69 blk.7.attn_q.weight 0xd6c441e0 0xdc0000
70 blk.7.attn_v.weight 0xd7a041e0 0x370000
71 blk.7.ffn_down.weight 0xd7d741e0 0x8340000
72 blk.7.ffn_gate.weight 0xe00b41e0 0x6e00000
73 blk.7.ffn_norm.weight 0xe6eb41e0 0x5000
74 blk.7.ffn_up.weight 0xe6eb91e0 0x6e00000
75 blk.8.attn_k.weight 0xedcb91e0 0x370000
76 blk.8.attn_norm.weight 0xee0291e0 0x5000
77 blk.8.attn_output.weight 0xee02e1e0 0xdc0000
78 blk.8.attn_q.weight 0xeedee1e0 0xdc0000
79 blk.8.attn_v.weight 0xefbae1e0 0x370000
80 blk.8.ffn_down.weight 0xeff1e1e0 0x8340000
81 blk.8.ffn_gate.weight 0xf825e1e0 0x6e00000
82 blk.8.ffn_norm.weight 0xff05e1e0 0x5000
83 blk.8.ffn_up.weight 0xff0631e0 0x6e00000
84 blk.9.attn_k.weight 0x105e631e0 0x370000
85 blk.9.attn_norm.weight 0x1061d31e0 0x5000
86 blk.9.attn_output.weight 0x1061d81e0 0xdc0000
87 blk.9.attn_q.weight 0x106f981e0 0xdc0000
88 blk.9.attn_v.weight 0x107d581e0 0x370000
89 blk.9.ffn_down.weight 0x1080c81e0 0x8340000
90 blk.9.ffn_gate.weight 0x1104081e0 0x6e00000
91 blk.9.ffn_norm.weight 0x1172081e0 0x5000
92 blk.9.ffn_up.weight 0x11720d1e0 0x6e00000
93 blk.10.attn_k.weight 0x11e00d1e0 0x2d0000
94 blk.10.attn_norm.weight 0x11e2dd1e0 0x5000
95 blk.10.attn_output.weight 0x11e2e21e0 0xdc0000
96 blk.10.attn_q.weight 0x11f0a21e0 0xb40000
97 blk.10.attn_v.weight 0x11fbe21e0 0x370000
98 blk.10.ffn_down.weight 0x11ff521e0 0x8340000
99 blk.10.ffn_gate.weight 0x1282921e0 0x5a00000
100 blk.10.ffn_norm.weight 0x12dc921e0 0x5000
101 blk.10.ffn_up.weight 0x12dc971e0 0x5a00000
102 blk.11.attn_k.weight 0x1336971e0 0x370000
103 blk.11.attn_norm.weight 0x133a071e0 0x5000
104 blk.11.attn_output.weight 0x133a0c1e0 0xdc0000
105 blk.11.attn_q.weight 0x1347cc1e0 0xdc0000
106 blk.11.attn_v.weight 0x13558c1e0 0x370000
107 blk.11.ffn_down.weight 0x1358fc1e0 0x8340000
108 blk.11.ffn_gate.weight 0x13dc3c1e0 0x5a00000
109 blk.11.ffn_norm.weight 0x14363c1e0 0x5000
110 blk.11.ffn_up.weight 0x1436411e0 0x5a00000
111 blk.12.attn_k.weight 0x1490411e0 0x2d0000
112 blk.12.attn_norm.weight 0x1493111e0 0x5000
113 blk.12.attn_output.weight 0x1493161e0 0xdc0000
114 blk.12.attn_q.weight 0x14a0d61e0 0xb40000
115 blk.12.attn_v.weight 0x14ac161e0 0x370000
116 blk.12.ffn_down.weight 0x14af861e0 0x8340000
117 blk.12.ffn_gate.weight 0x1532c61e0 0x5a00000
118 blk.12.ffn_norm.weight 0x158cc61e0 0x5000
119 blk.12.ffn_up.weight 0x158ccb1e0 0x5a00000
120 blk.13.attn_k.weight 0x15e6cb1e0 0x2d0000
121 blk.13.attn_norm.weight 0x15e99b1e0 0x5000
122 blk.13.attn_output.weight 0x15e9a01e0 0xdc0000
123 blk.13.attn_q.weight 0x15f7601e0 0xb40000
124 blk.13.attn_v.weight 0x1602a01e0 0x370000
125 blk.13.ffn_down.weight 0x1606101e0 0x8340000
126 blk.13.ffn_gate.weight 0x1689501e0 0x5a00000
127 blk.13.ffn_norm.weight 0x16e3501e0 0x5000
128 blk.13.ffn_up.weight 0x16e3551e0 0x5a00000
129 blk.14.attn_k.weight 0x173d551e0 0x2d0000
130 blk.14.attn_norm.weight 0x1740251e0 0x5000
131 blk.14.attn_output.weight 0x17402a1e0 0xdc0000
132 blk.14.attn_q.weight 0x174dea1e0 0xb40000
133 blk.14.attn_v.weight 0x17592a1e0 0x370000
134 blk.14.ffn_down.weight 0x175c9a1e0 0x8340000
135 blk.14.ffn_gate.weight 0x17dfda1e0 0x5a00000
136 blk.14.ffn_norm.weight 0x1839da1e0 0x5000
137 blk.14.ffn_up.weight 0x1839df1e0 0x5a00000
138 blk.15.attn_k.weight 0x1893df1e0 0x2d0000
139 blk.15.attn_norm.weight 0x1896af1e0 0x5000
140 blk.15.attn_output.weight 0x1896b41e0 0xdc0000
141 blk.15.attn_q.weight 0x18a4741e0 0xb40000
142 blk.15.attn_v.weight 0x18afb41e0 0x370000
143 blk.15.ffn_down.weight 0x18b3241e0 0x8340000
144 blk.15.ffn_gate.weight 0x1936641e0 0x5a00000
145 blk.15.ffn_norm.weight 0x1990641e0 0x5000
146 blk.15.ffn_up.weight 0x1990691e0 0x5a00000
147 blk.16.attn_k.weight 0x19ea691e0 0x2d0000
148 blk.16.attn_norm.weight 0x19ed391e0 0x5000
149 blk.16.attn_output.weight 0x19ed3e1e0 0xdc0000
150 blk.16.attn_q.weight 0x19fafe1e0 0xb40000
151 blk.16.attn_v.weight 0x1a063e1e0 0x370000
152 blk.16.ffn_down.weight 0x1a09ae1e0 0x8340000
153 blk.16.ffn_gate.weight 0x1a8cee1e0 0x5a00000
154 blk.16.ffn_norm.weight 0x1ae6ee1e0 0x5000
155 blk.16.ffn_up.weight 0x1ae6f31e0 0x5a00000
156 blk.17.attn_k.weight 0x1b40f31e0 0x2d0000
157 blk.17.attn_norm.weight 0x1b43c31e0 0x5000
158 blk.17.attn_output.weight 0x1b43c81e0 0xdc0000
159 blk.17.attn_q.weight 0x1b51881e0 0xb40000
160 blk.17.attn_v.weight 0x1b5cc81e0 0x370000
161 blk.17.ffn_down.weight 0x1b60381e0 0x8340000
162 blk.17.ffn_gate.weight 0x1be3781e0 0x5a00000
163 blk.17.ffn_norm.weight 0x1c3d781e0 0x5000
164 blk.17.ffn_up.weight 0x1c3d7d1e0 0x5a00000
165 blk.18.attn_k.weight 0x1c977d1e0 0x2d0000
166 blk.18.attn_norm.weight 0x1c9a4d1e0 0x5000
167 blk.18.attn_output.weight 0x1c9a521e0 0xdc0000
168 blk.18.attn_q.weight 0x1ca8121e0 0xb40000
169 blk.18.attn_v.weight 0x1cb3521e0 0x370000
170 blk.18.ffn_down.weight 0x1cb6c21e0 0x8340000
171 blk.18.ffn_gate.weight 0x1d3a021e0 0x5a00000
172 blk.18.ffn_norm.weight 0x1d94021e0 0x5000
173 blk.18.ffn_up.weight 0x1d94071e0 0x5a00000
174 blk.19.attn_k.weight 0x1dee071e0 0x2d0000
175 blk.19.attn_norm.weight 0x1df0d71e0 0x5000
176 blk.19.attn_output.weight 0x1df0dc1e0 0xdc0000
177 blk.19.attn_q.weight 0x1dfe9c1e0 0xb40000
178 blk.19.attn_v.weight 0x1e09dc1e0 0x370000
179 blk.19.ffn_down.weight 0x1e0d4c1e0 0x8340000
180 blk.19.ffn_gate.weight 0x1e908c1e0 0x5a00000
181 blk.19.ffn_norm.weight 0x1eea8c1e0 0x5000
182 blk.19.ffn_up.weight 0x1eea911e0 0x5a00000
183 blk.20.attn_k.weight 0x1f44911e0 0x2d0000
184 blk.20.attn_norm.weight 0x1f47611e0 0x5000
185 blk.20.attn_output.weight 0x1f47661e0 0xdc0000
186 blk.20.attn_q.weight 0x1f55261e0 0xb40000
187 blk.20.attn_v.weight 0x1f60661e0 0x370000
188 blk.20.ffn_down.weight 0x1f63d61e0 0x6e00000
189 blk.20.ffn_gate.weight 0x1fd1d61e0 0x5a00000
190 blk.20.ffn_norm.weight 0x202bd61e0 0x5000
191 blk.20.ffn_up.weight 0x202bdb1e0 0x5a00000
192 blk.21.attn_k.weight 0x2085db1e0 0x2d0000
193 blk.21.attn_norm.weight 0x2088ab1e0 0x5000
194 blk.21.attn_output.weight 0x2088b01e0 0xdc0000
195 blk.21.attn_q.weight 0x2096701e0 0xb40000
196 blk.21.attn_v.weight 0x20a1b01e0 0x370000
197 blk.21.ffn_down.weight 0x20a5201e0 0x6e00000
198 blk.21.ffn_gate.weight 0x2113201e0 0x5a00000
199 blk.21.ffn_norm.weight 0x216d201e0 0x5000
200 blk.21.ffn_up.weight 0x216d251e0 0x5a00000
201 blk.22.attn_k.weight 0x21c7251e0 0x2d0000
202 blk.22.attn_norm.weight 0x21c9f51e0 0x5000
203 blk.22.attn_output.weight 0x21c9fa1e0 0xdc0000
204 blk.22.attn_q.weight 0x21d7ba1e0 0xb40000
205 blk.22.attn_v.weight 0x21e2fa1e0 0x370000
206 blk.22.ffn_down.weight 0x21e66a1e0 0x6e00000
207 blk.22.ffn_gate.weight 0x22546a1e0 0x5a00000
208 blk.22.ffn_norm.weight 0x22ae6a1e0 0x5000
209 blk.22.ffn_up.weight 0x22ae6f1e0 0x5a00000
210 blk.23.attn_k.weight 0x23086f1e0 0x2d0000
211 blk.23.attn_norm.weight 0x230b3f1e0 0x5000
212 blk.23.attn_output.weight 0x230b441e0 0xdc0000
213 blk.23.attn_q.weight 0x2319041e0 0xb40000
214 blk.23.attn_v.weight 0x2324441e0 0x370000
215 blk.23.ffn_down.weight 0x2327b41e0 0x6e00000
216 blk.23.ffn_gate.weight 0x2395b41e0 0x5a00000
217 blk.23.ffn_norm.weight 0x23efb41e0 0x5000
218 blk.23.ffn_up.weight 0x23efb91e0 0x5a00000
219 blk.24.attn_k.weight 0x2449b91e0 0x2d0000
220 blk.24.attn_norm.weight 0x244c891e0 0x5000
221 blk.24.attn_output.weight 0x244c8e1e0 0xdc0000
222 blk.24.attn_q.weight 0x245a4e1e0 0xb40000
223 blk.24.attn_v.weight 0x24658e1e0 0x370000
224 blk.24.ffn_down.weight 0x2468fe1e0 0x6e00000
225 blk.24.ffn_gate.weight 0x24d6fe1e0 0x5a00000
226 blk.24.ffn_norm.weight 0x2530fe1e0 0x5000
227 blk.24.ffn_up.weight 0x2531031e0 0x5a00000
228 blk.25.attn_k.weight 0x258b031e0 0x2d0000
229 blk.25.attn_norm.weight 0x258dd31e0 0x5000
230 blk.25.attn_output.weight 0x258dd81e0 0xdc0000
231 blk.25.attn_q.weight 0x259b981e0 0xb40000
232 blk.25.attn_v.weight 0x25a6d81e0 0x370000
233 blk.25.ffn_down.weight 0x25aa481e0 0x6e00000
234 blk.25.ffn_gate.weight 0x2618481e0 0x5a00000
235 blk.25.ffn_norm.weight 0x2672481e0 0x5000
236 blk.25.ffn_up.weight 0x26724d1e0 0x5a00000
237 blk.26.attn_k.weight 0x26cc4d1e0 0x2d0000
238 blk.26.attn_norm.weight 0x26cf1d1e0 0x5000
239 blk.26.attn_output.weight 0x26cf221e0 0xdc0000
240 blk.26.attn_q.weight 0x26dce21e0 0xb40000
241 blk.26.attn_v.weight 0x26e8221e0 0x370000
242 blk.26.ffn_down.weight 0x26eb921e0 0x6e00000
243 blk.26.ffn_gate.weight 0x2759921e0 0x5a00000
244 blk.26.ffn_norm.weight 0x27b3921e0 0x5000
245 blk.26.ffn_up.weight 0x27b3971e0 0x5a00000
246 blk.27.attn_k.weight 0x280d971e0 0x2d0000
247 blk.27.attn_norm.weight 0x2810671e0 0x5000
248 blk.27.attn_output.weight 0x28106c1e0 0xdc0000
249 blk.27.attn_q.weight 0x281e2c1e0 0xb40000
250 blk.27.attn_v.weight 0x28296c1e0 0x370000
251 blk.27.ffn_down.weight 0x282cdc1e0 0x6e00000
252 blk.27.ffn_gate.weight 0x289adc1e0 0x5a00000
253 blk.27.ffn_norm.weight 0x28f4dc1e0 0x5000
254 blk.27.ffn_up.weight 0x28f4e11e0 0x5a00000
255 blk.28.attn_k.weight 0x294ee11e0 0x2d0000
256 blk.28.attn_norm.weight 0x2951b11e0 0x5000
257 blk.28.attn_output.weight 0x2951b61e0 0xdc0000
258 blk.28.attn_q.weight 0x295f761e0 0xb40000
259 blk.28.attn_v.weight 0x296ab61e0 0x370000
260 blk.28.ffn_down.weight 0x296e261e0 0x6e00000
261 blk.28.ffn_gate.weight 0x29dc261e0 0x5a00000
262 blk.28.ffn_norm.weight 0x2a36261e0 0x5000
263 blk.28.ffn_up.weight 0x2a362b1e0 0x5a00000
264 blk.29.attn_k.weight 0x2a902b1e0 0x2d0000
265 blk.29.attn_norm.weight 0x2a92fb1e0 0x5000
266 blk.29.attn_output.weight 0x2a93001e0 0xdc0000
267 blk.29.attn_q.weight 0x2aa0c01e0 0xb40000
268 blk.29.attn_v.weight 0x2aac001e0 0x370000
269 blk.29.ffn_down.weight 0x2aaf701e0 0x6e00000
270 blk.29.ffn_gate.weight 0x2b1d701e0 0x5a00000
271 blk.29.ffn_norm.weight 0x2b77701e0 0x5000
272 blk.29.ffn_up.weight 0x2b77751e0 0x5a00000
273 blk.30.attn_k.weight 0x2bd1751e0 0x2d0000
274 blk.30.attn_norm.weight 0x2bd4451e0 0x5000
275 blk.30.attn_output.weight 0x2bd44a1e0 0xdc0000
276 blk.30.attn_q.weight 0x2be20a1e0 0xb40000
277 blk.30.attn_v.weight 0x2bed4a1e0 0x370000
278 blk.30.ffn_down.weight 0x2bf0ba1e0 0x6e00000
279 blk.30.ffn_gate.weight 0x2c5eba1e0 0x5a00000
280 blk.30.ffn_norm.weight 0x2cb8ba1e0 0x5000
281 blk.30.ffn_up.weight 0x2cb8bf1e0 0x5a00000
282 blk.31.attn_k.weight 0x2d12bf1e0 0x2d0000
283 blk.31.attn_norm.weight 0x2d158f1e0 0x5000
284 blk.31.attn_output.weight 0x2d15941e0 0xdc0000
285 blk.31.attn_q.weight 0x2d23541e0 0xb40000
286 blk.31.attn_v.weight 0x2d2e941e0 0x370000
287 blk.31.ffn_down.weight 0x2d32041e0 0x6e00000
288 blk.31.ffn_gate.weight 0x2da0041e0 0x5a00000
289 blk.31.ffn_norm.weight 0x2dfa041e0 0x5000
290 blk.31.ffn_up.weight 0x2dfa091e0 0x5a00000
291 blk.32.attn_k.weight 0x2e54091e0 0x2d0000
292 blk.32.attn_norm.weight 0x2e56d91e0 0x5000
293 blk.32.attn_output.weight 0x2e56de1e0 0xdc0000
294 blk.32.attn_q.weight 0x2e649e1e0 0xb40000
295 blk.32.attn_v.weight 0x2e6fde1e0 0x370000
296 blk.32.ffn_down.weight 0x2e734e1e0 0x6e00000
297 blk.32.ffn_gate.weight 0x2ee14e1e0 0x5a00000
298 blk.32.ffn_norm.weight 0x2f3b4e1e0 0x5000
299 blk.32.ffn_up.weight 0x2f3b531e0 0x5a00000
300 blk.33.attn_k.weight 0x2f95531e0 0x2d0000
301 blk.33.attn_norm.weight 0x2f98231e0 0x5000
302 blk.33.attn_output.weight 0x2f98281e0 0xdc0000
303 blk.33.attn_q.weight 0x2fa5e81e0 0xb40000
304 blk.33.attn_v.weight 0x2fb1281e0 0x370000
305 blk.33.ffn_down.weight 0x2fb4981e0 0x6e00000
306 blk.33.ffn_gate.weight 0x3022981e0 0x5a00000
307 blk.33.ffn_norm.weight 0x307c981e0 0x5000
308 blk.33.ffn_up.weight 0x307c9d1e0 0x5a00000
309 blk.34.attn_k.weight 0x30d69d1e0 0x2d0000
310 blk.34.attn_norm.weight 0x30d96d1e0 0x5000
311 blk.34.attn_output.weight 0x30d9721e0 0xdc0000
312 blk.34.attn_q.weight 0x30e7321e0 0xb40000
313 blk.34.attn_v.weight 0x30f2721e0 0x370000
314 blk.34.ffn_down.weight 0x30f5e21e0 0x6e00000
315 blk.34.ffn_gate.weight 0x3163e21e0 0x5a00000
316 blk.34.ffn_norm.weight 0x31bde21e0 0x5000
317 blk.34.ffn_up.weight 0x31bde71e0 0x5a00000
318 blk.35.attn_k.weight 0x3217e71e0 0x2d0000
319 blk.35.attn_norm.weight 0x321ab71e0 0x5000
320 blk.35.attn_output.weight 0x321abc1e0 0xdc0000
321 blk.35.attn_q.weight 0x32287c1e0 0xb40000
322 blk.35.attn_v.weight 0x3233bc1e0 0x370000
323 blk.35.ffn_down.weight 0x32372c1e0 0x6e00000
324 blk.35.ffn_gate.weight 0x32a52c1e0 0x5a00000
325 blk.35.ffn_norm.weight 0x32ff2c1e0 0x5000
326 blk.35.ffn_up.weight 0x32ff311e0 0x5a00000
327 blk.36.attn_k.weight 0x3359311e0 0x2d0000
328 blk.36.attn_norm.weight 0x335c011e0 0x5000
329 blk.36.attn_output.weight 0x335c061e0 0xdc0000
330 blk.36.attn_q.weight 0x3369c61e0 0xb40000
331 blk.36.attn_v.weight 0x3375061e0 0x370000
332 blk.36.ffn_down.weight 0x3378761e0 0x6e00000
333 blk.36.ffn_gate.weight 0x33e6761e0 0x5a00000
334 blk.36.ffn_norm.weight 0x3440761e0 0x5000
335 blk.36.ffn_up.weight 0x34407b1e0 0x5a00000
336 blk.37.attn_k.weight 0x349a7b1e0 0x2d0000
337 blk.37.attn_norm.weight 0x349d4b1e0 0x5000
338 blk.37.attn_output.weight 0x349d501e0 0xdc0000
339 blk.37.attn_q.weight 0x34ab101e0 0xb40000
340 blk.37.attn_v.weight 0x34b6501e0 0x370000
341 blk.37.ffn_down.weight 0x34b9c01e0 0x6e00000
342 blk.37.ffn_gate.weight 0x3527c01e0 0x5a00000
343 blk.37.ffn_norm.weight 0x3581c01e0 0x5000
344 blk.37.ffn_up.weight 0x3581c51e0 0x5a00000
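
The sizes above are consistent with the element counts and quantization types listed in the group tables that follow: for these k-quant and F32 types, data size is exactly elements × BPW / 8. A spot check, with values copied from the tables below:

```python
# (tensor name, elements, bits per weight, data size from the offset table)
checks = [
    ("blk.0.attn_k.weight",   5_242_880,   5.5,    0x370000),    # Q5_K
    ("blk.0.ffn_down.weight", 167_772_160, 6.5625, 0x8340000),   # Q6_K
    ("token_embd.weight",     671_088_640, 3.4375, 0x11300000),  # Q3_K
]
for name, n_elements, bpw, data_size in checks:
    assert n_elements * bpw / 8 == data_size, name
print("all sizes consistent")
```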

Base Tensor Group : ~1B Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
0 output.weight Output (W) (~671M) 671088640 5120 x 131072 x 1 x 1 Q5_K 5.5000
1 output_norm.weight Output Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
2 token_embd.weight Token Embedding (W) (~671M) 671088640 5120 x 131072 x 1 x 1 Q3_K 3.4375
  • Total elements in base: ( ~1B) 1342182400
  • Percentage of total elements: 5.98%
  • Bits per Weight (BPW) for base: 4.4689 bits
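
Each group's BPW figure is the element-weighted average of its tensors' per-type BPW values. A minimal sketch that reproduces the 4.4689 above, and the 5.8212 reported for blk.0 in the next table:

```python
def group_bpw(tensors):
    """tensors: (elements, bits_per_weight) pairs for one group."""
    return sum(n * bpw for n, bpw in tensors) / sum(n for n, _ in tensors)

base = [(671_088_640, 5.5), (5_120, 32.0), (671_088_640, 3.4375)]
blk0 = [(5_242_880, 5.5), (5_120, 32.0), (20_971_520, 5.5), (20_971_520, 5.5),
        (5_242_880, 5.5), (167_772_160, 6.5625), (167_772_160, 5.5),
        (5_120, 32.0), (167_772_160, 5.5)]
print(f"{group_bpw(base):.4f}")  # 4.4689
print(f"{group_bpw(blk0):.4f}")  # 5.8212
```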

Block 0 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
3 blk.0.attn_k.weight Block 0 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
4 blk.0.attn_norm.weight Block 0 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
5 blk.0.attn_output.weight Block 0 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
6 blk.0.attn_q.weight Block 0 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
7 blk.0.attn_v.weight Block 0 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
8 blk.0.ffn_down.weight Block 0 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
9 blk.0.ffn_gate.weight Block 0 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
10 blk.0.ffn_norm.weight Block 0 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
11 blk.0.ffn_up.weight Block 0 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.0: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.0: 5.8212 bits

Block 1 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
12 blk.1.attn_k.weight Block 1 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
13 blk.1.attn_norm.weight Block 1 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
14 blk.1.attn_output.weight Block 1 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
15 blk.1.attn_q.weight Block 1 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
16 blk.1.attn_v.weight Block 1 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
17 blk.1.ffn_down.weight Block 1 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
18 blk.1.ffn_gate.weight Block 1 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
19 blk.1.ffn_norm.weight Block 1 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
20 blk.1.ffn_up.weight Block 1 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.1: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.1: 5.8212 bits

Block 2 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
21 blk.2.attn_k.weight Block 2 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
22 blk.2.attn_norm.weight Block 2 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
23 blk.2.attn_output.weight Block 2 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
24 blk.2.attn_q.weight Block 2 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
25 blk.2.attn_v.weight Block 2 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
26 blk.2.ffn_down.weight Block 2 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
27 blk.2.ffn_gate.weight Block 2 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
28 blk.2.ffn_norm.weight Block 2 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
29 blk.2.ffn_up.weight Block 2 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.2: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.2: 5.8212 bits

Block 3 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
30 blk.3.attn_k.weight Block 3 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
31 blk.3.attn_norm.weight Block 3 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
32 blk.3.attn_output.weight Block 3 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
33 blk.3.attn_q.weight Block 3 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
34 blk.3.attn_v.weight Block 3 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
35 blk.3.ffn_down.weight Block 3 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
36 blk.3.ffn_gate.weight Block 3 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
37 blk.3.ffn_norm.weight Block 3 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
38 blk.3.ffn_up.weight Block 3 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.3: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.3: 5.7741 bits

Block 4 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
39 blk.4.attn_k.weight Block 4 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
40 blk.4.attn_norm.weight Block 4 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
41 blk.4.attn_output.weight Block 4 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
42 blk.4.attn_q.weight Block 4 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
43 blk.4.attn_v.weight Block 4 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
44 blk.4.ffn_down.weight Block 4 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
45 blk.4.ffn_gate.weight Block 4 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
46 blk.4.ffn_norm.weight Block 4 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
47 blk.4.ffn_up.weight Block 4 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.4: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.4: 5.8212 bits

Block 5 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
48 blk.5.attn_k.weight Block 5 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
49 blk.5.attn_norm.weight Block 5 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
50 blk.5.attn_output.weight Block 5 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
51 blk.5.attn_q.weight Block 5 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
52 blk.5.attn_v.weight Block 5 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
53 blk.5.ffn_down.weight Block 5 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
54 blk.5.ffn_gate.weight Block 5 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
55 blk.5.ffn_norm.weight Block 5 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
56 blk.5.ffn_up.weight Block 5 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.5: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.5: 5.8212 bits

Block 6 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
57 blk.6.attn_k.weight Block 6 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
58 blk.6.attn_norm.weight Block 6 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
59 blk.6.attn_output.weight Block 6 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
60 blk.6.attn_q.weight Block 6 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
61 blk.6.attn_v.weight Block 6 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
62 blk.6.ffn_down.weight Block 6 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
63 blk.6.ffn_gate.weight Block 6 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
64 blk.6.ffn_norm.weight Block 6 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
65 blk.6.ffn_up.weight Block 6 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.6: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.6: 5.8212 bits

Block 7 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
66 blk.7.attn_k.weight Block 7 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
67 blk.7.attn_norm.weight Block 7 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
68 blk.7.attn_output.weight Block 7 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
69 blk.7.attn_q.weight Block 7 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
70 blk.7.attn_v.weight Block 7 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
71 blk.7.ffn_down.weight Block 7 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
72 blk.7.ffn_gate.weight Block 7 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
73 blk.7.ffn_norm.weight Block 7 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
74 blk.7.ffn_up.weight Block 7 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.7: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.7: 5.8212 bits

Block 8 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
75 blk.8.attn_k.weight Block 8 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
76 blk.8.attn_norm.weight Block 8 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
77 blk.8.attn_output.weight Block 8 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
78 blk.8.attn_q.weight Block 8 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
79 blk.8.attn_v.weight Block 8 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
80 blk.8.ffn_down.weight Block 8 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
81 blk.8.ffn_gate.weight Block 8 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
82 blk.8.ffn_norm.weight Block 8 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
83 blk.8.ffn_up.weight Block 8 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.8: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.8: 5.8212 bits

Block 9 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
84 blk.9.attn_k.weight Block 9 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
85 blk.9.attn_norm.weight Block 9 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
86 blk.9.attn_output.weight Block 9 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
87 blk.9.attn_q.weight Block 9 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
88 blk.9.attn_v.weight Block 9 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
89 blk.9.ffn_down.weight Block 9 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
90 blk.9.ffn_gate.weight Block 9 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
91 blk.9.ffn_norm.weight Block 9 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
92 blk.9.ffn_up.weight Block 9 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q5_K 5.5000
  • Total elements in blk.9: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.9: 5.8212 bits

Block 10 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
93 blk.10.attn_k.weight Block 10 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
94 blk.10.attn_norm.weight Block 10 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
95 blk.10.attn_output.weight Block 10 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
96 blk.10.attn_q.weight Block 10 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
97 blk.10.attn_v.weight Block 10 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
98 blk.10.ffn_down.weight Block 10 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
99 blk.10.ffn_gate.weight Block 10 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
100 blk.10.ffn_norm.weight Block 10 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
101 blk.10.ffn_up.weight Block 10 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.10: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.10: 5.1703 bits
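
blk.10 is the first block where the FFN gate and up tensors join the query and key at Q4_K, which is what lowers the group average. A quick check of the 5.1703 figure, using the same element-weighted average as above:

```python
# Element-weighted BPW for blk.10 (values from the table above).
q4  = (5_242_880 + 20_971_520 + 2 * 167_772_160) * 4.5  # attn_k, attn_q, ffn_gate, ffn_up
q5  = (20_971_520 + 5_242_880) * 5.5                    # attn_output, attn_v
q6  = 167_772_160 * 6.5625                              # ffn_down
f32 = 2 * 5_120 * 32.0                                  # attn_norm, ffn_norm
print(f"{(q4 + q5 + q6 + f32) / 555_755_520:.4f}")      # 5.1703
```

From blk.20 onward the tables also show ffn_down dropping from Q6_K to Q5_K, which brings those per-block figures down to 4.8496 bits.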

Block 11 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
102 blk.11.attn_k.weight Block 11 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
103 blk.11.attn_norm.weight Block 11 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
104 blk.11.attn_output.weight Block 11 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
105 blk.11.attn_q.weight Block 11 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q5_K 5.5000
106 blk.11.attn_v.weight Block 11 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
107 blk.11.ffn_down.weight Block 11 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
108 blk.11.ffn_gate.weight Block 11 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
109 blk.11.ffn_norm.weight Block 11 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
110 blk.11.ffn_up.weight Block 11 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.11: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.11: 5.2175 bits

Block 12 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
111 blk.12.attn_k.weight Block 12 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
112 blk.12.attn_norm.weight Block 12 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
113 blk.12.attn_output.weight Block 12 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
114 blk.12.attn_q.weight Block 12 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
115 blk.12.attn_v.weight Block 12 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
116 blk.12.ffn_down.weight Block 12 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
117 blk.12.ffn_gate.weight Block 12 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
118 blk.12.ffn_norm.weight Block 12 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
119 blk.12.ffn_up.weight Block 12 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.12: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.12: 5.1703 bits

Block 13 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
120 blk.13.attn_k.weight Block 13 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
121 blk.13.attn_norm.weight Block 13 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
122 blk.13.attn_output.weight Block 13 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
123 blk.13.attn_q.weight Block 13 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
124 blk.13.attn_v.weight Block 13 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
125 blk.13.ffn_down.weight Block 13 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
126 blk.13.ffn_gate.weight Block 13 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
127 blk.13.ffn_norm.weight Block 13 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
128 blk.13.ffn_up.weight Block 13 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.13: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.13: 5.1703 bits

Block 14 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
129 blk.14.attn_k.weight Block 14 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
130 blk.14.attn_norm.weight Block 14 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
131 blk.14.attn_output.weight Block 14 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
132 blk.14.attn_q.weight Block 14 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
133 blk.14.attn_v.weight Block 14 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
134 blk.14.ffn_down.weight Block 14 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
135 blk.14.ffn_gate.weight Block 14 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
136 blk.14.ffn_norm.weight Block 14 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
137 blk.14.ffn_up.weight Block 14 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.14: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.14: 5.1703 bits

Block 15 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
138 blk.15.attn_k.weight Block 15 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
139 blk.15.attn_norm.weight Block 15 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
140 blk.15.attn_output.weight Block 15 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
141 blk.15.attn_q.weight Block 15 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
142 blk.15.attn_v.weight Block 15 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
143 blk.15.ffn_down.weight Block 15 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
144 blk.15.ffn_gate.weight Block 15 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
145 blk.15.ffn_norm.weight Block 15 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
146 blk.15.ffn_up.weight Block 15 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.15: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.15: 5.1703 bits

Block 16 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
147 blk.16.attn_k.weight Block 16 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
148 blk.16.attn_norm.weight Block 16 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
149 blk.16.attn_output.weight Block 16 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
150 blk.16.attn_q.weight Block 16 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
151 blk.16.attn_v.weight Block 16 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
152 blk.16.ffn_down.weight Block 16 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
153 blk.16.ffn_gate.weight Block 16 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
154 blk.16.ffn_norm.weight Block 16 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
155 blk.16.ffn_up.weight Block 16 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.16: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.16: 5.1703 bits

Block 17 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
156 blk.17.attn_k.weight Block 17 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
157 blk.17.attn_norm.weight Block 17 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
158 blk.17.attn_output.weight Block 17 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
159 blk.17.attn_q.weight Block 17 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
160 blk.17.attn_v.weight Block 17 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
161 blk.17.ffn_down.weight Block 17 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
162 blk.17.ffn_gate.weight Block 17 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
163 blk.17.ffn_norm.weight Block 17 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
164 blk.17.ffn_up.weight Block 17 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.17: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.17: 5.1703 bits

Block 18 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
165 blk.18.attn_k.weight Block 18 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
166 blk.18.attn_norm.weight Block 18 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
167 blk.18.attn_output.weight Block 18 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
168 blk.18.attn_q.weight Block 18 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
169 blk.18.attn_v.weight Block 18 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
170 blk.18.ffn_down.weight Block 18 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
171 blk.18.ffn_gate.weight Block 18 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
172 blk.18.ffn_norm.weight Block 18 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
173 blk.18.ffn_up.weight Block 18 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.18: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.18: 5.1703 bits

Block 19 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
174 blk.19.attn_k.weight Block 19 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
175 blk.19.attn_norm.weight Block 19 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
176 blk.19.attn_output.weight Block 19 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
177 blk.19.attn_q.weight Block 19 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
178 blk.19.attn_v.weight Block 19 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
179 blk.19.ffn_down.weight Block 19 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q6_K 6.5625
180 blk.19.ffn_gate.weight Block 19 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
181 blk.19.ffn_norm.weight Block 19 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
182 blk.19.ffn_up.weight Block 19 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.19: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.19: 5.1703 bits

Block 20 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
183 blk.20.attn_k.weight Block 20 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
184 blk.20.attn_norm.weight Block 20 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
185 blk.20.attn_output.weight Block 20 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
186 blk.20.attn_q.weight Block 20 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
187 blk.20.attn_v.weight Block 20 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
188 blk.20.ffn_down.weight Block 20 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q5_K 5.5000
189 blk.20.ffn_gate.weight Block 20 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
190 blk.20.ffn_norm.weight Block 20 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
191 blk.20.ffn_up.weight Block 20 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.20: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.20: 4.8496 bits

Block 21 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
192 blk.21.attn_k.weight Block 21 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
193 blk.21.attn_norm.weight Block 21 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
194 blk.21.attn_output.weight Block 21 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
195 blk.21.attn_q.weight Block 21 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
196 blk.21.attn_v.weight Block 21 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
197 blk.21.ffn_down.weight Block 21 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q5_K 5.5000
198 blk.21.ffn_gate.weight Block 21 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
199 blk.21.ffn_norm.weight Block 21 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
200 blk.21.ffn_up.weight Block 21 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.21: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.21: 4.8496 bits

Block 22 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
201 blk.22.attn_k.weight Block 22 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
202 blk.22.attn_norm.weight Block 22 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
203 blk.22.attn_output.weight Block 22 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
204 blk.22.attn_q.weight Block 22 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
205 blk.22.attn_v.weight Block 22 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
206 blk.22.ffn_down.weight Block 22 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q5_K 5.5000
207 blk.22.ffn_gate.weight Block 22 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
208 blk.22.ffn_norm.weight Block 22 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
209 blk.22.ffn_up.weight Block 22 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.22: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.22: 4.8496 bits

Block 23 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
210 blk.23.attn_k.weight Block 23 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
211 blk.23.attn_norm.weight Block 23 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
212 blk.23.attn_output.weight Block 23 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
213 blk.23.attn_q.weight Block 23 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
214 blk.23.attn_v.weight Block 23 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
215 blk.23.ffn_down.weight Block 23 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q5_K 5.5000
216 blk.23.ffn_gate.weight Block 23 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
217 blk.23.ffn_norm.weight Block 23 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
218 blk.23.ffn_up.weight Block 23 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.23: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.23: 4.8496 bits

Block 24 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
219 blk.24.attn_k.weight Block 24 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
220 blk.24.attn_norm.weight Block 24 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
221 blk.24.attn_output.weight Block 24 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
222 blk.24.attn_q.weight Block 24 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
223 blk.24.attn_v.weight Block 24 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
224 blk.24.ffn_down.weight Block 24 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q5_K 5.5000
225 blk.24.ffn_gate.weight Block 24 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
226 blk.24.ffn_norm.weight Block 24 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
227 blk.24.ffn_up.weight Block 24 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.24: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.24: 4.8496 bits

Block 25 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
228 blk.25.attn_k.weight Block 25 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
229 blk.25.attn_norm.weight Block 25 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
230 blk.25.attn_output.weight Block 25 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
231 blk.25.attn_q.weight Block 25 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
232 blk.25.attn_v.weight Block 25 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
233 blk.25.ffn_down.weight Block 25 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q5_K 5.5000
234 blk.25.ffn_gate.weight Block 25 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
235 blk.25.ffn_norm.weight Block 25 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
236 blk.25.ffn_up.weight Block 25 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.25: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.25: 4.8496 bits

Block 26 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
237 blk.26.attn_k.weight Block 26 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
238 blk.26.attn_norm.weight Block 26 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
239 blk.26.attn_output.weight Block 26 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
240 blk.26.attn_q.weight Block 26 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
241 blk.26.attn_v.weight Block 26 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
242 blk.26.ffn_down.weight Block 26 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q5_K 5.5000
243 blk.26.ffn_gate.weight Block 26 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
244 blk.26.ffn_norm.weight Block 26 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
245 blk.26.ffn_up.weight Block 26 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.26: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.26: 4.8496 bits

Block 27 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
246 blk.27.attn_k.weight Block 27 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
247 blk.27.attn_norm.weight Block 27 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
248 blk.27.attn_output.weight Block 27 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
249 blk.27.attn_q.weight Block 27 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
250 blk.27.attn_v.weight Block 27 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
251 blk.27.ffn_down.weight Block 27 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q5_K 5.5000
252 blk.27.ffn_gate.weight Block 27 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
253 blk.27.ffn_norm.weight Block 27 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
254 blk.27.ffn_up.weight Block 27 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.27: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.27: 4.8496 bits

Block 28 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
255 blk.28.attn_k.weight Block 28 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
256 blk.28.attn_norm.weight Block 28 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
257 blk.28.attn_output.weight Block 28 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
258 blk.28.attn_q.weight Block 28 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
259 blk.28.attn_v.weight Block 28 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
260 blk.28.ffn_down.weight Block 28 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q5_K 5.5000
261 blk.28.ffn_gate.weight Block 28 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
262 blk.28.ffn_norm.weight Block 28 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
263 blk.28.ffn_up.weight Block 28 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.28: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.28: 4.8496 bits

Block 29 Tensor Group : ~556M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type BPW
264 blk.29.attn_k.weight Block 29 Attention Key (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q4_K 4.5000
265 blk.29.attn_norm.weight Block 29 Attention Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
266 blk.29.attn_output.weight Block 29 Attention Output (W) ( ~21M) 20971520 4096 x 5120 x 1 x 1 Q5_K 5.5000
267 blk.29.attn_q.weight Block 29 Attention Query (W) ( ~21M) 20971520 5120 x 4096 x 1 x 1 Q4_K 4.5000
268 blk.29.attn_v.weight Block 29 Attention Value (W) ( ~5M) 5242880 5120 x 1024 x 1 x 1 Q5_K 5.5000
269 blk.29.ffn_down.weight Block 29 Feed-Forward Network "Down" (W) (~168M) 167772160 32768 x 5120 x 1 x 1 Q5_K 5.5000
270 blk.29.ffn_gate.weight Block 29 Feed-Forward Network "Gate" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
271 blk.29.ffn_norm.weight Block 29 Feed-Forward Network Normalization (W) ( ~5K) 5120 5120 x 1 x 1 x 1 F32 32.0000
272 blk.29.ffn_up.weight Block 29 Feed-Forward Network "Up" (W) (~168M) 167772160 5120 x 32768 x 1 x 1 Q4_K 4.5000
  • Total elements in blk.29: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.29: 4.8496 bits

Block 30 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type | BPW |
| ---: | :---------------- | :------------------------------- | -------: | :---- | :--- | ---: |
| 273 | blk.30.attn_k.weight | Block 30 Attention Key (W) | (~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K | 4.5000 |
| 274 | blk.30.attn_norm.weight | Block 30 Attention Normalization (W) | (~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 275 | blk.30.attn_output.weight | Block 30 Attention Output (W) | (~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q5_K | 5.5000 |
| 276 | blk.30.attn_q.weight | Block 30 Attention Query (W) | (~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q4_K | 4.5000 |
| 277 | blk.30.attn_v.weight | Block 30 Attention Value (W) | (~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q5_K | 5.5000 |
| 278 | blk.30.ffn_down.weight | Block 30 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q5_K | 5.5000 |
| 279 | blk.30.ffn_gate.weight | Block 30 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q4_K | 4.5000 |
| 280 | blk.30.ffn_norm.weight | Block 30 Feed-Forward Network Normalization (W) | (~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 281 | blk.30.ffn_up.weight | Block 30 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q4_K | 4.5000 |
  • Total elements in blk.30: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.30: 4.8496 bits

Block 31 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type | BPW |
| ---: | :---------------- | :------------------------------- | -------: | :---- | :--- | ---: |
| 282 | blk.31.attn_k.weight | Block 31 Attention Key (W) | (~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K | 4.5000 |
| 283 | blk.31.attn_norm.weight | Block 31 Attention Normalization (W) | (~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 284 | blk.31.attn_output.weight | Block 31 Attention Output (W) | (~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q5_K | 5.5000 |
| 285 | blk.31.attn_q.weight | Block 31 Attention Query (W) | (~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q4_K | 4.5000 |
| 286 | blk.31.attn_v.weight | Block 31 Attention Value (W) | (~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q5_K | 5.5000 |
| 287 | blk.31.ffn_down.weight | Block 31 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q5_K | 5.5000 |
| 288 | blk.31.ffn_gate.weight | Block 31 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q4_K | 4.5000 |
| 289 | blk.31.ffn_norm.weight | Block 31 Feed-Forward Network Normalization (W) | (~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 290 | blk.31.ffn_up.weight | Block 31 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q4_K | 4.5000 |
  • Total elements in blk.31: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.31: 4.8496 bits

Block 32 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type | BPW |
| ---: | :---------------- | :------------------------------- | -------: | :---- | :--- | ---: |
| 291 | blk.32.attn_k.weight | Block 32 Attention Key (W) | (~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K | 4.5000 |
| 292 | blk.32.attn_norm.weight | Block 32 Attention Normalization (W) | (~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 293 | blk.32.attn_output.weight | Block 32 Attention Output (W) | (~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q5_K | 5.5000 |
| 294 | blk.32.attn_q.weight | Block 32 Attention Query (W) | (~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q4_K | 4.5000 |
| 295 | blk.32.attn_v.weight | Block 32 Attention Value (W) | (~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q5_K | 5.5000 |
| 296 | blk.32.ffn_down.weight | Block 32 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q5_K | 5.5000 |
| 297 | blk.32.ffn_gate.weight | Block 32 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q4_K | 4.5000 |
| 298 | blk.32.ffn_norm.weight | Block 32 Feed-Forward Network Normalization (W) | (~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 299 | blk.32.ffn_up.weight | Block 32 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q4_K | 4.5000 |
  • Total elements in blk.32: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.32: 4.8496 bits

Block 33 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type | BPW |
| ---: | :---------------- | :------------------------------- | -------: | :---- | :--- | ---: |
| 300 | blk.33.attn_k.weight | Block 33 Attention Key (W) | (~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K | 4.5000 |
| 301 | blk.33.attn_norm.weight | Block 33 Attention Normalization (W) | (~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 302 | blk.33.attn_output.weight | Block 33 Attention Output (W) | (~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q5_K | 5.5000 |
| 303 | blk.33.attn_q.weight | Block 33 Attention Query (W) | (~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q4_K | 4.5000 |
| 304 | blk.33.attn_v.weight | Block 33 Attention Value (W) | (~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q5_K | 5.5000 |
| 305 | blk.33.ffn_down.weight | Block 33 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q5_K | 5.5000 |
| 306 | blk.33.ffn_gate.weight | Block 33 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q4_K | 4.5000 |
| 307 | blk.33.ffn_norm.weight | Block 33 Feed-Forward Network Normalization (W) | (~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 308 | blk.33.ffn_up.weight | Block 33 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q4_K | 4.5000 |
  • Total elements in blk.33: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.33: 4.8496 bits

Block 34 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type | BPW |
| ---: | :---------------- | :------------------------------- | -------: | :---- | :--- | ---: |
| 309 | blk.34.attn_k.weight | Block 34 Attention Key (W) | (~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K | 4.5000 |
| 310 | blk.34.attn_norm.weight | Block 34 Attention Normalization (W) | (~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 311 | blk.34.attn_output.weight | Block 34 Attention Output (W) | (~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q5_K | 5.5000 |
| 312 | blk.34.attn_q.weight | Block 34 Attention Query (W) | (~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q4_K | 4.5000 |
| 313 | blk.34.attn_v.weight | Block 34 Attention Value (W) | (~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q5_K | 5.5000 |
| 314 | blk.34.ffn_down.weight | Block 34 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q5_K | 5.5000 |
| 315 | blk.34.ffn_gate.weight | Block 34 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q4_K | 4.5000 |
| 316 | blk.34.ffn_norm.weight | Block 34 Feed-Forward Network Normalization (W) | (~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 317 | blk.34.ffn_up.weight | Block 34 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q4_K | 4.5000 |
  • Total elements in blk.34: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.34: 4.8496 bits

Block 35 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type | BPW |
| ---: | :---------------- | :------------------------------- | -------: | :---- | :--- | ---: |
| 318 | blk.35.attn_k.weight | Block 35 Attention Key (W) | (~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K | 4.5000 |
| 319 | blk.35.attn_norm.weight | Block 35 Attention Normalization (W) | (~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 320 | blk.35.attn_output.weight | Block 35 Attention Output (W) | (~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q5_K | 5.5000 |
| 321 | blk.35.attn_q.weight | Block 35 Attention Query (W) | (~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q4_K | 4.5000 |
| 322 | blk.35.attn_v.weight | Block 35 Attention Value (W) | (~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q5_K | 5.5000 |
| 323 | blk.35.ffn_down.weight | Block 35 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q5_K | 5.5000 |
| 324 | blk.35.ffn_gate.weight | Block 35 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q4_K | 4.5000 |
| 325 | blk.35.ffn_norm.weight | Block 35 Feed-Forward Network Normalization (W) | (~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 326 | blk.35.ffn_up.weight | Block 35 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q4_K | 4.5000 |
  • Total elements in blk.35: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.35: 4.8496 bits

Block 36 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type | BPW |
| ---: | :---------------- | :------------------------------- | -------: | :---- | :--- | ---: |
| 327 | blk.36.attn_k.weight | Block 36 Attention Key (W) | (~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K | 4.5000 |
| 328 | blk.36.attn_norm.weight | Block 36 Attention Normalization (W) | (~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 329 | blk.36.attn_output.weight | Block 36 Attention Output (W) | (~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q5_K | 5.5000 |
| 330 | blk.36.attn_q.weight | Block 36 Attention Query (W) | (~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q4_K | 4.5000 |
| 331 | blk.36.attn_v.weight | Block 36 Attention Value (W) | (~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q5_K | 5.5000 |
| 332 | blk.36.ffn_down.weight | Block 36 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q5_K | 5.5000 |
| 333 | blk.36.ffn_gate.weight | Block 36 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q4_K | 4.5000 |
| 334 | blk.36.ffn_norm.weight | Block 36 Feed-Forward Network Normalization (W) | (~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 335 | blk.36.ffn_up.weight | Block 36 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q4_K | 4.5000 |
  • Total elements in blk.36: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.36: 4.8496 bits

Block 37 Tensor Group : ~556M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type | BPW |
| ---: | :---------------- | :------------------------------- | -------: | :---- | :--- | ---: |
| 336 | blk.37.attn_k.weight | Block 37 Attention Key (W) | (~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K | 4.5000 |
| 337 | blk.37.attn_norm.weight | Block 37 Attention Normalization (W) | (~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 338 | blk.37.attn_output.weight | Block 37 Attention Output (W) | (~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q5_K | 5.5000 |
| 339 | blk.37.attn_q.weight | Block 37 Attention Query (W) | (~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q4_K | 4.5000 |
| 340 | blk.37.attn_v.weight | Block 37 Attention Value (W) | (~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q5_K | 5.5000 |
| 341 | blk.37.ffn_down.weight | Block 37 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q5_K | 5.5000 |
| 342 | blk.37.ffn_gate.weight | Block 37 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q4_K | 4.5000 |
| 343 | blk.37.ffn_norm.weight | Block 37 Feed-Forward Network Normalization (W) | (~5K) 5120 | 5120 x 1 x 1 x 1 | F32 | 32.0000 |
| 344 | blk.37.ffn_up.weight | Block 37 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q4_K | 4.5000 |
  • Total elements in blk.37: (~556M) 555755520
  • Percentage of total elements: 2.47%
  • Bits per Weight (BPW) for blk.37: 4.8496 bits

Total BPW for Mistral-Small-3.2-24B-Instruct-pruned-Q5_K_S.gguf: 5.1466 bits
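
The file-level figure averages over every tensor in the file, not just the repeating blocks above, which is why it differs from the per-block 4.8496. A minimal sketch to recompute it straight from the GGUF file, assuming the `gguf` Python package that ships with llama.cpp (`pip install gguf`):

```python
# Recompute total bits-per-weight from the GGUF file itself.
# GGUFReader exposes n_elements and n_bytes for each tensor.
from gguf import GGUFReader

reader = GGUFReader("Mistral-Small-3.2-24B-Instruct-pruned-Q5_K_S.gguf")

total_elements = sum(int(t.n_elements) for t in reader.tensors)
total_bits     = sum(int(t.n_bytes) * 8 for t in reader.tensors)

print(total_elements)                        # ~22B elements
print(f"{total_bits / total_elements:.4f}")  # ~5.1466 bits per weight
```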