Single-HPU Training
Training on a single device is as simple as in Transformers:
- Replace the Transformers' `Trainer` class with the `GaudiTrainer` class.
- Replace the Transformers' `TrainingArguments` class with the `GaudiTrainingArguments` class and add the following arguments:
  - `use_habana` to execute your script on an HPU,
  - `use_lazy_mode` to use lazy mode (recommended) or eager mode,
  - `gaudi_config_name` to give the name of (Hub) or the path to (local) your Gaudi configuration file.
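The substitutions above can be sketched as follows. This is a minimal sketch, assuming `optimum-habana` is installed and an HPU is available; the model, dataset, and Gaudi configuration name are illustrative placeholders, not part of this page:

```python
# Hedged sketch of a single-HPU training script; `model` and
# `train_dataset` are placeholders you would define with Transformers
# and Datasets as usual.
from optimum.habana import GaudiTrainer, GaudiTrainingArguments

training_args = GaudiTrainingArguments(
    output_dir="./results",
    use_habana=True,       # execute the script on an HPU
    use_lazy_mode=True,    # lazy mode (recommended) rather than eager mode
    gaudi_config_name="Habana/bert-base-uncased",  # Hub name or local path (example)
)

trainer = GaudiTrainer(
    model=model,                  # your Transformers model
    args=training_args,
    train_dataset=train_dataset,  # your training dataset
)
trainer.train()
```

Apart from these two substitutions, the training loop, checkpointing, and evaluation work as they do with the standard Transformers `Trainer`.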
To go further, we invite you to read our guides about accelerating training and pretraining.