MOSAIC: Multi-Subject Personalized Generation via Correspondence-Aware Alignment and Disentanglement • Paper 2509.01977 • Published Sep 2
Article: Accelerate Large Model Training using PyTorch Fully Sharded Data Parallel • May 2, 2022
Article: makeMoE: Implement a Sparse Mixture of Experts Language Model from Scratch • By AviSoori1x • May 7, 2024