boatbomber committed (verified)
Commit f4f39fa · 1 Parent(s): 005d8b3

Update README.md

Files changed (1)
  1. README.md +31 -30
README.md CHANGED
@@ -1,30 +1,31 @@
- ---
- license: gemma
- license_link: https://ai.google.dev/gemma/terms
- base_model: google/gemma-3-27b-it
- finetuned_by: boatbomber
- pipeline_tag: text-generation
- tags:
- - chat
- - roblox
- - luau
- language:
- - en
- datasets:
- - boatbomber/roblox-info-dump
- - boatbomber/the-luau-stack
- ---
-
- # Gemma-3-27B-Roblox-Luau
-
- A fine tune of [google/gemma-3-27b-it](google/gemma-3-27b-it) using [boatbomber/roblox-info-dump](https://huggingface.co/datasets/boatbomber/roblox-info-dump) and [boatbomber/the-luau-stack](https://huggingface.co/datasets/boatbomber/the-luau-stack) for Roblox domain knowledge.
-
- Available quants:
-
- | Quant | Size | Notes |
- | ------ | ------- | ----- |
- | Q8_0 | 30.21GB | High resource use, but generally acceptable. Use only when accuracy is crucial. |
- | Q6_K | 23.32GB | Uses Q6_K for all tensors. Good for high end GPUs. |
- | Q5_K_M | 20.24GB | Uses Q6_K for half of the attention.wv and feed_forward.w2 tensors, else Q5_K |
- | Q4_K_M | 17.34GB | **Recommended.** Uses Q6_K for half of the attention.wv and feed_forward.w2 tensors, else Q4_K |
- | Q3_K_M | 14.04GB | Uses Q4_K for the attention.wv, attention.wo, and feed_forward.w2 tensors, else Q3_K. Quality is noticeably degraded. |
+ ---
+ license: gemma
+ license_link: https://ai.google.dev/gemma/terms
+ base_model: google/gemma-3-27b-it
+ base_model_relation: finetune
+ finetuned_by: boatbomber
+ pipeline_tag: text-generation
+ tags:
+ - chat
+ - roblox
+ - luau
+ language:
+ - en
+ datasets:
+ - boatbomber/roblox-info-dump
+ - boatbomber/the-luau-stack
+ ---
+
+ # Gemma-3-27B-Roblox-Luau
+
+ A fine tune of [google/gemma-3-27b-it](google/gemma-3-27b-it) using [boatbomber/roblox-info-dump](https://huggingface.co/datasets/boatbomber/roblox-info-dump) and [boatbomber/the-luau-stack](https://huggingface.co/datasets/boatbomber/the-luau-stack) for Roblox domain knowledge.
+
+ Available quants:
+
+ | Quant | Size | Notes |
+ | ------ | ------- | ----- |
+ | Q8_0 | 30.21GB | High resource use, but generally acceptable. Use only when accuracy is crucial. |
+ | Q6_K | 23.32GB | Uses Q6_K for all tensors. Good for high end GPUs. |
+ | Q5_K_M | 20.24GB | Uses Q6_K for half of the attention.wv and feed_forward.w2 tensors, else Q5_K |
+ | Q4_K_M | 17.34GB | **Recommended.** Uses Q6_K for half of the attention.wv and feed_forward.w2 tensors, else Q4_K |
+ | Q3_K_M | 14.04GB | Uses Q4_K for the attention.wv, attention.wo, and feed_forward.w2 tensors, else Q3_K. Quality is noticeably degraded. |
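
For reference, a minimal sketch of chatting with one of the quants listed above using llama-cpp-python. The repo id and GGUF filename pattern below are assumptions (check the repo's file listing for the actual names); only the quant types come from the table in the README.

```python
# Minimal sketch: load the recommended Q4_K_M quant and request some Luau.
# NOTE: repo_id and the filename pattern are assumptions, not taken from the commit.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="boatbomber/Gemma-3-27B-Roblox-Luau",  # assumed repo id for this model card
    filename="*Q4_K_M*.gguf",                      # recommended quant per the table above
    n_ctx=8192,
    n_gpu_layers=-1,  # offload all layers to GPU if VRAM allows (~17GB for Q4_K_M)
)

reply = llm.create_chat_completion(
    messages=[
        {"role": "user",
         "content": "Write a Luau function that debounces a part's Touched event."},
    ],
    max_tokens=512,
)
print(reply["choices"][0]["message"]["content"])
```

Q4_K_M is used here because the table marks it as the recommended trade-off; swap the filename pattern for Q8_0 or Q6_K if accuracy matters more than memory.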