---
license: mit
datasets:
  - togethercomputer/RedPajama-Data-1T-Sample
language:
  - en
library_name: transformers
---

This is a set of sparse autoencoders (SAEs) trained on the residual stream of Llama 3 8B, using the 10B-token sample of the RedPajama v2 corpus, which comes out to roughly 8.5B tokens with the Llama 3 tokenizer. The SAEs are organized by layer and can be loaded with the EleutherAI `sae` library.
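A minimal loading sketch using the EleutherAI `sae` library is below. The repository ID and the `layers.10` hookpoint are placeholders, since this card does not name them; substitute the ID of this repo and the layer you want.

```python
from sae import Sae

# Hypothetical repo ID and hookpoint -- replace with this repo's actual ID
# and the residual-stream layer you want to analyze.
sae = Sae.load_from_hub("EleutherAI/sae-llama-3-8b", hookpoint="layers.10")

# The loaded SAE can then encode residual-stream activations from Llama 3 8B
# into sparse latent features.
```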

These are early checkpoints of an ongoing training run, which can be tracked here. They will be updated as training progresses. The most recent upload was at 7,000 training steps.