---
license: apache-2.0
datasets:
  - stanfordnlp/imdb
language:
  - en
metrics:
  - perplexity
base_model:
  - google-bert/bert-base-uncased
pipeline_tag: fill-mask
tags:
  - generatedtext
---

# MLM

This model is a fine-tuned version of google-bert/bert-base-uncased on the stanfordnlp/imdb dataset. It achieves the following results on the evaluation set:

- Eval loss: 4.18
- Perplexity (PPL): 50.58
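For masked-LM evaluation, perplexity is the exponential of the mean per-token cross-entropy loss. Note that exponentiating a batch-averaged eval loss can give a slightly different number than a token-weighted average, which is one common reason the two figures above do not line up as a simple `exp(loss)`. A minimal sketch of the computation, using illustrative loss values rather than the actual eval tokens:

```python
import math

# Per-token masked-LM cross-entropy losses (illustrative values only;
# the real evaluation uses the masked tokens of the IMDB eval split).
token_losses = [3.9, 4.1, 3.7, 4.3, 3.6]

# PPL = exp(mean cross-entropy over masked tokens)
mean_loss = sum(token_losses) / len(token_losses)
perplexity = math.exp(mean_loss)
print(round(perplexity, 2))
```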

## Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-5
- train_batch_size: 4
- gradient_accumulation_steps: 2
- seed: 42
- weight_decay: 0.01
- lr_scheduler_type: linear
- warmup_ratio: 0.1
- num_epochs: 2
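The training script itself is not included in the card. As a sketch only, the hyperparameters above map onto `transformers.TrainingArguments` roughly as follows (the `output_dir` name and the use of the Trainer API are assumptions, not taken from the card):

```python
from transformers import TrainingArguments

# Sketch: the listed hyperparameters expressed via the Trainer API.
training_args = TrainingArguments(
    output_dir="bert-imdb-mlm",       # placeholder name, not from the card
    learning_rate=3e-5,
    per_device_train_batch_size=4,    # "train_batch_size" above
    gradient_accumulation_steps=2,    # effective batch size: 4 * 2 = 8
    seed=42,
    weight_decay=0.01,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=2,
)
```

With gradient accumulation, the effective batch size is `4 * 2 = 8` sequences per optimizer step.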