Update README.md
README.md
@@ -111,8 +111,8 @@ model = fully_shard(model, **fsdp_kwargs)
 | Llama-3.1-8B-Instruct-w16a8-4nodes-bs32 | 31478468 | 39.75 | 4 | 4 | **2.650** | **10.600** | 4 | 4 | 8 | 512 |
 | Llama-3.1-8B-Instruct-w16a16-8nodes-bs32 | 31476914 | 22.00 | 8 | 4 | **2.933** | **11.733** | 4 | 4 | 8 | 1024 |
 | Llama-3.1-8B-Instruct-w16a8-8nodes-bs32 | 31476844 | 23.50 | 8 | 4 | **3.133** | **12.533** | 4 | 4 | 8 | 1024 |
-| Llama-3.1-8B-Instruct-w16a16-8nodes-bs64 | 31476914 | 22.00 | 8 | 4 | **2.933** | **11.733** | 4 |
-| Llama-3.1-8B-Instruct-w16a8-8nodes-bs64 | 31476844 | 23.50 | 8 | 4 | **3.133** | **12.533** | 4 |
+| Llama-3.1-8B-Instruct-w16a16-8nodes-bs64 | 31476914 | 22.00 | 8 | 4 | **2.933** | **11.733** | 4 | 8 | 8 | 1024 |
+| Llama-3.1-8B-Instruct-w16a8-8nodes-bs64 | 31476844 | 23.50 | 8 | 4 | **3.133** | **12.533** | 4 | 8 | 8 | 1024 |
 
 # *All 6 models trained on 1, 4, and 8 nodes, with both bf16 and bf16-fp8 configurations*
 | Perplexity metric results for bf16 and bf16-fp8 configurations | Accuracy metric results for bf16 and bf16-fp8 configurations | Loss metric results for bf16 and bf16-fp8 configurations | Memory allocation for bf16 and bf16-fp8 configurations | Utilization for bf16 and bf16-fp8 configurations |
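If the bolded columns are throughput figures (per-GPU and per-node respectively — an assumption; this excerpt does not label the columns), the 8-node rows imply a modest gain from the fp8 configuration. A minimal sketch of that arithmetic, using the values from the table above:

```python
# Hedged sketch: relative throughput of the w16a8 (bf16 + fp8) runs vs. the
# w16a16 (pure bf16) runs at 8 nodes, read from the bolded table columns.
# The column meanings are assumed, not stated in this README excerpt.
bf16_throughput = 2.933  # w16a16-8nodes rows
fp8_throughput = 3.133   # w16a8-8nodes rows

speedup = fp8_throughput / bf16_throughput
print(f"fp8 vs bf16 speedup at 8 nodes: {speedup:.3f}x")  # ~1.068x
```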