Languages: English
bhatta1 committed · Commit bbb4ee2 · verified · 1 Parent(s): dd79ed8

Update README.md

Files changed (1): README.md (+3 -1)
README.md CHANGED

@@ -103,4 +103,6 @@ Given than training models of size `7 Billion` parameters require lot more compu
 
  4. [Notebook](https://github.com/IBM/data-prep-kit/blob/dev/examples/notebooks/GneissWeb/GneissWeb.ipynb) to recreate GneissWeb using the methods described above
 
- 5. [Notebook](https://github.com/ian-cho/data-prep-kit/blob/dev/transforms/universal/bloom/bloom_python.ipynb) to recreate GneissWeb using a bloom filter built on the document ids of GneissWeb
+ 5. [Notebook](https://github.com/ian-cho/data-prep-kit/blob/dev/transforms/universal/bloom/bloom_python.ipynb) to recreate GneissWeb using a bloom filter built on the document ids of GneissWeb
+
+ 6. [Blog](https://research.ibm.com/blog/gneissweb-for-granite-training) and [Paper](https://huggingface.co/datasets/ibm-granite/GneissWeb/blob/main/GneissWebPaper_Feb21_2025.pdf)
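The notebook added in item 5 recreates GneissWeb by testing each candidate document's id against a Bloom filter built from the GneissWeb document ids. A minimal sketch of that membership-filtering idea (pure Python, with hypothetical document ids and sizes; not the notebook's actual implementation):

```python
import hashlib

class BloomFilter:
    """Tiny illustrative Bloom filter: k hash positions per item over a bit array.

    A Bloom filter never produces false negatives (every added id always
    tests positive), but may produce rare false positives.
    """

    def __init__(self, size_bits: int = 1 << 20, num_hashes: int = 5):
        self.size = size_bits
        self.k = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item: str):
        # Derive k bit positions by salting the item with the hash index.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item: str) -> None:
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item: str) -> bool:
        return all((self.bits[p // 8] >> (p % 8)) & 1 for p in self._positions(item))

# Build the filter from the target dataset's document ids (hypothetical ids).
gneissweb_ids = BloomFilter()
for doc_id in ["doc-001", "doc-042"]:
    gneissweb_ids.add(doc_id)

# Recreate the subset by keeping only documents whose id is in the filter.
corpus = [{"id": "doc-001"}, {"id": "doc-007"}, {"id": "doc-042"}]
kept = [doc for doc in corpus if doc["id"] in gneissweb_ids]
```

Because membership tests touch only a fixed-size bit array, this scales to filtering a full Common Crawl snapshot without holding the id set in a hash table; the trade-off is a tunable false-positive rate (a few documents outside the target set may slip through).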