---
library_name: transformers
license: apache-2.0
base_model: google/vit-hybrid-base-bit-384
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: vit-hybrid-base-bit-384_rice-leaf-disease-augmented-v4_v5_pft
  results: []
---

# vit-hybrid-base-bit-384_rice-leaf-disease-augmented-v4_v5_pft

This model is a fine-tuned version of [google/vit-hybrid-base-bit-384](https://huggingface.co/google/vit-hybrid-base-bit-384) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2706
- Accuracy: 0.9195

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 256
- num_epochs: 30
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.2177        | 0.5   | 64   | 2.0234          | 0.2248   |
| 1.7719        | 1.0   | 128  | 1.4579          | 0.5872   |
| 1.258         | 1.5   | 192  | 1.0302          | 0.7081   |
| 0.9323        | 2.0   | 256  | 0.7450          | 0.8154   |
| 0.7252        | 2.5   | 320  | 0.6527          | 0.7953   |
| 0.6134        | 3.0   | 384  | 0.5488          | 0.8154   |
| 0.5375        | 3.5   | 448  | 0.5004          | 0.8221   |
| 0.5028        | 4.0   | 512  | 0.4624          | 0.8557   |
| 0.471         | 4.5   | 576  | 0.4532          | 0.8557   |
| 0.44          | 5.0   | 640  | 0.4378          | 0.8557   |
| 0.4302        | 5.5   | 704  | 0.4446          | 0.8423   |
| 0.4267        | 6.0   | 768  | 0.4300          | 0.8591   |
| 0.4341        | 6.5   | 832  | 0.4302          | 0.8591   |
| 0.406         | 7.0   | 896  | 0.4174          | 0.8658   |
| 0.4077        | 7.5   | 960  | 0.3973          | 0.8523   |
| 0.3639        | 8.0   | 1024 | 0.3747          | 0.8792   |
| 0.3463        | 8.5   | 1088 | 0.3701          | 0.8859   |
| 0.343         | 9.0   | 1152 | 0.3682          | 0.8859   |
| 0.322         | 9.5   | 1216 | 0.3567          | 0.8792   |
| 0.3224        | 10.0  | 1280 | 0.3555          | 0.8859   |
| 0.3103        | 10.5  | 1344 | 0.3529          | 0.8859   |
| 0.314         | 11.0  | 1408 | 0.3531          | 0.8859   |
| 0.3153        | 11.5  | 1472 | 0.3546          | 0.8859   |
| 0.3033        | 12.0  | 1536 | 0.3434          | 0.8792   |
| 0.2905        | 12.5  | 1600 | 0.3326          | 0.8859   |
| 0.2857        | 13.0  | 1664 | 0.3323          | 0.8893   |
| 0.2693        | 13.5  | 1728 | 0.3238          | 0.8893   |
| 0.2683        | 14.0  | 1792 | 0.3273          | 0.9027   |
| 0.2582        | 14.5  | 1856 | 0.3243          | 0.9060   |
| 0.2544        | 15.0  | 1920 | 0.3181          | 0.8993   |
| 0.2478        | 15.5  | 1984 | 0.3167          | 0.8993   |
| 0.255         | 16.0  | 2048 | 0.3166          | 0.8993   |
| 0.2586        | 16.5  | 2112 | 0.3087          | 0.8993   |
| 0.24          | 17.0  | 2176 | 0.3126          | 0.9060   |
| 0.2351        | 17.5  | 2240 | 0.3032          | 0.9027   |
| 0.2302        | 18.0  | 2304 | 0.3005          | 0.9094   |
| 0.2229        | 18.5  | 2368 | 0.2993          | 0.9128   |
| 0.2185        | 19.0  | 2432 | 0.2982          | 0.9027   |
| 0.2138        | 19.5  | 2496 | 0.2968          | 0.9027   |
| 0.2128        | 20.0  | 2560 | 0.2952          | 0.9027   |
| 0.2134        | 20.5  | 2624 | 0.2946          | 0.9027   |
| 0.2107        | 21.0  | 2688 | 0.3014          | 0.8993   |
| 0.2077        | 21.5  | 2752 | 0.2885          | 0.9060   |
| 0.2073        | 22.0  | 2816 | 0.2911          | 0.9094   |
| 0.1943        | 22.5  | 2880 | 0.2853          | 0.9128   |
| 0.1979        | 23.0  | 2944 | 0.2806          | 0.9094   |
| 0.1907        | 23.5  | 3008 | 0.2793          | 0.9161   |
| 0.1848        | 24.0  | 3072 | 0.2794          | 0.9094   |
| 0.1884        | 24.5  | 3136 | 0.2780          | 0.9094   |
| 0.179         | 25.0  | 3200 | 0.2778          | 0.9094   |
| 0.1872        | 25.5  | 3264 | 0.2828          | 0.9128   |
| 0.181         | 26.0  | 3328 | 0.2749          | 0.9128   |
| 0.1779        | 26.5  | 3392 | 0.2752          | 0.9094   |
| 0.1783        | 27.0  | 3456 | 0.2730          | 0.9128   |
| 0.1777        | 27.5  | 3520 | 0.2720          | 0.9195   |
| 0.162         | 28.0  | 3584 | 0.2717          | 0.9128   |
| 0.1682        | 28.5  | 3648 | 0.2679          | 0.9128   |
| 0.1599        | 29.0  | 3712 | 0.2709          | 0.9128   |
| 0.1587        | 29.5  | 3776 | 0.2711          | 0.9161   |
| 0.164         | 30.0  | 3840 | 0.2706          | 0.9195   |

### Framework versions

- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.1
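For reference, the learning-rate schedule implied by the hyperparameters above (linear warmup over 256 steps, then `cosine_with_restarts` decay over the 3840 total steps shown in the training table) can be sketched as follows. This is a minimal illustration assuming the shape of the standard warmup-plus-cosine-with-hard-restarts schedule in `transformers`; `NUM_CYCLES` is an assumption (1 is the library default), not a value recorded in this card.

```python
import math

# Values taken from the hyperparameters above; NUM_CYCLES is assumed.
BASE_LR = 3e-4
WARMUP_STEPS = 256
TOTAL_STEPS = 3840   # 30 epochs x 128 optimizer steps per epoch (see table)
NUM_CYCLES = 1       # assumed; transformers' default for this scheduler

def lr_at(step, base_lr=BASE_LR, warmup=WARMUP_STEPS,
          total=TOTAL_STEPS, num_cycles=NUM_CYCLES):
    """Learning rate at a given optimizer step: linear warmup,
    then cosine decay with hard restarts."""
    if step < warmup:
        # Linear ramp from 0 up to base_lr over the warmup steps.
        return base_lr * step / max(1, warmup)
    progress = (step - warmup) / max(1, total - warmup)
    if progress >= 1.0:
        return 0.0
    # Cosine decay; the modulo makes the curve restart num_cycles times.
    return base_lr * max(0.0, 0.5 * (1.0 + math.cos(math.pi * ((num_cycles * progress) % 1.0))))

lr_at(0)      # 0.0 (start of warmup)
lr_at(256)    # 0.0003 (peak at the end of warmup)
lr_at(3840)   # 0.0 (end of training)
```

With `NUM_CYCLES = 1` this reduces to a single cosine decay from the peak learning rate to zero; larger values would reset the rate to the peak at each restart.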