Upload ViT model from experiment c1
- .gitattributes +2 -0
- README.md +166 -0
- config.json +76 -0
- confusion_matrices/ViT_Confusion_Matrix_a.png +0 -0
- confusion_matrices/ViT_Confusion_Matrix_b.png +0 -0
- confusion_matrices/ViT_Confusion_Matrix_c.png +0 -0
- confusion_matrices/ViT_Confusion_Matrix_d.png +0 -0
- confusion_matrices/ViT_Confusion_Matrix_e.png +0 -0
- confusion_matrices/ViT_Confusion_Matrix_f.png +0 -0
- confusion_matrices/ViT_Confusion_Matrix_g.png +0 -0
- confusion_matrices/ViT_Confusion_Matrix_h.png +0 -0
- confusion_matrices/ViT_Confusion_Matrix_i.png +0 -0
- confusion_matrices/ViT_Confusion_Matrix_j.png +0 -0
- confusion_matrices/ViT_Confusion_Matrix_k.png +0 -0
- confusion_matrices/ViT_Confusion_Matrix_l.png +0 -0
- evaluation_results.csv +133 -0
- model.safetensors +3 -0
- pytorch_model.bin +3 -0
- roc_confusion_matrix/ViT_roc_confusion_matrix_a.png +0 -0
- roc_confusion_matrix/ViT_roc_confusion_matrix_b.png +0 -0
- roc_confusion_matrix/ViT_roc_confusion_matrix_c.png +0 -0
- roc_confusion_matrix/ViT_roc_confusion_matrix_d.png +0 -0
- roc_confusion_matrix/ViT_roc_confusion_matrix_e.png +0 -0
- roc_confusion_matrix/ViT_roc_confusion_matrix_f.png +0 -0
- roc_confusion_matrix/ViT_roc_confusion_matrix_g.png +0 -0
- roc_confusion_matrix/ViT_roc_confusion_matrix_h.png +0 -0
- roc_confusion_matrix/ViT_roc_confusion_matrix_i.png +0 -0
- roc_confusion_matrix/ViT_roc_confusion_matrix_j.png +0 -0
- roc_confusion_matrix/ViT_roc_confusion_matrix_k.png +0 -0
- roc_confusion_matrix/ViT_roc_confusion_matrix_l.png +0 -0
- roc_curves/ViT_ROC_a.png +0 -0
- roc_curves/ViT_ROC_b.png +0 -0
- roc_curves/ViT_ROC_c.png +0 -0
- roc_curves/ViT_ROC_d.png +0 -0
- roc_curves/ViT_ROC_e.png +0 -0
- roc_curves/ViT_ROC_f.png +0 -0
- roc_curves/ViT_ROC_g.png +0 -0
- roc_curves/ViT_ROC_h.png +0 -0
- roc_curves/ViT_ROC_i.png +0 -0
- roc_curves/ViT_ROC_j.png +0 -0
- roc_curves/ViT_ROC_k.png +0 -0
- roc_curves/ViT_ROC_l.png +0 -0
- training_curves/ViT_accuracy.png +0 -0
- training_curves/ViT_auc.png +0 -0
- training_curves/ViT_combined_metrics.png +3 -0
- training_curves/ViT_f1.png +0 -0
- training_curves/ViT_loss.png +0 -0
- training_curves/ViT_metrics.csv +36 -0
- training_metrics.csv +36 -0
- training_notebook_c1.ipynb +3 -0
.gitattributes
CHANGED
@@ -33,3 +33,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+training_curves/ViT_combined_metrics.png filter=lfs diff=lfs merge=lfs -text
+training_notebook_c1.ipynb filter=lfs diff=lfs merge=lfs -text
README.md
ADDED
@@ -0,0 +1,166 @@
---
license: apache-2.0
tags:
- image-classification
- pytorch
- timm
- vit
- vision-transformer
- transformer
- gravitational-lensing
- strong-lensing
- astronomy
- astrophysics
datasets:
- parlange/gravit-c21-j24
metrics:
- accuracy
- auc
- f1
paper:
- title: "GraViT: A Gravitational Lens Discovery Toolkit with Vision Transformers"
  url: "https://arxiv.org/abs/2509.00226"
  authors: "Parlange et al."
model-index:
- name: ViT-c1
  results:
  - task:
      type: image-classification
      name: Strong Gravitational Lens Discovery
    dataset:
      type: common-test-sample
      name: Common Test Sample (More et al. 2024)
    metrics:
      - type: accuracy
        value: 0.8028
        name: Average Accuracy
      - type: auc
        value: 0.8562
        name: Average AUC-ROC
      - type: f1
        value: 0.5517
        name: Average F1-Score
---
# 🌌 vit-gravit-c1

🔭 This model is part of **GraViT**: Transfer Learning with Vision Transformers and MLP-Mixer for Strong Gravitational Lens Discovery

🔗 **GitHub Repository**: [https://github.com/parlange/gravit](https://github.com/parlange/gravit)

## 🛰️ Model Details

- **🤖 Model Type**: ViT
- **🧪 Experiment**: C1 - C21+J24-classification-head
- **🌌 Dataset**: C21+J24
- **🪐 Fine-tuning Strategy**: classification-head

## 💻 Quick Start

```python
import torch
import timm

# Load the model directly from the Hub
model = timm.create_model(
    'hf-hub:parlange/vit-gravit-c1',
    pretrained=True
)
model.eval()

# Example inference
dummy_input = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    output = model(dummy_input)
    predictions = torch.softmax(output, dim=1)
print(f"Lens probability: {predictions[0][1]:.4f}")
```

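The snippet above runs on a random tensor; for real survey cutouts, the preprocessing stored with the model (224×224 input, bicubic resizing, ImageNet mean/std, see `config.json`) should be applied first. Below is a minimal sketch using timm's standard data utilities, reusing `model` from the snippet above; the image filename is a placeholder, not a file in this repository:

```python
from PIL import Image
import timm
import torch

# Resolve the resize/crop/normalization settings stored with the model
data_config = timm.data.resolve_model_data_config(model)
transform = timm.data.create_transform(**data_config, is_training=False)

# "cutout.png" is a placeholder for a real survey image
image = Image.open("cutout.png").convert("RGB")
input_tensor = transform(image).unsqueeze(0)  # shape: (1, 3, 224, 224)

with torch.no_grad():
    probs = torch.softmax(model(input_tensor), dim=1)
print(f"Lens probability: {probs[0][1]:.4f}")
```
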
## ⚡️ Training Configuration

**Training Dataset:** C21+J24 (Cañameras et al. 2021 + Jaelani et al. 2024)
**Fine-tuning Strategy:** classification-head

| 🔧 Parameter | 📝 Value |
|--------------|----------|
| Batch Size | 192 |
| Learning Rate Schedule | AdamW with ReduceLROnPlateau |
| Epochs | 100 |
| Patience | 10 |
| Optimizer | AdamW |
| Scheduler | ReduceLROnPlateau |
| Image Size | 224x224 |
| Fine Tune Mode | classification_head |
| Stochastic Depth Probability | 0.1 |

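With the classification-head strategy, only the final linear layer is updated while the ViT backbone stays frozen. The sketch below shows one way such a setup could be wired with the hyperparameters listed above; it is illustrative rather than the training script from this repository, the learning-rate value is a placeholder, and the patience value is applied to the scheduler here although it may refer to early stopping in the original run:

```python
import timm
import torch

# Backbone with a 2-class head and stochastic depth 0.1, as in the table above
model = timm.create_model("vit_base_patch16_224", pretrained=True,
                          num_classes=2, drop_path_rate=0.1)

# Freeze everything, then unfreeze only the classification head
for p in model.parameters():
    p.requires_grad = False
for p in model.get_classifier().parameters():
    p.requires_grad = True

# AdamW on the trainable parameters; lr=1e-3 is a placeholder, not from this repo
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
# ReduceLROnPlateau driven by validation loss
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode="min", patience=10)

# Inside the epoch loop, after computing the validation loss:
#     scheduler.step(val_loss)
```
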
## 📈 Training Curves

![Combined Training Metrics](training_curves/ViT_combined_metrics.png)

## 🏁 Final Epoch Training Metrics

| Metric | Training | Validation |
|:---------:|:-----------:|:-------------:|
| 📉 Loss | 0.3578 | 0.3397 |
| 🎯 Accuracy | 0.8453 | 0.8666 |
| 📊 AUC-ROC | 0.9223 | 0.9388 |
| ⚖️ F1 Score | 0.8453 | 0.8721 |

## ☑️ Evaluation Results

### ROC Curves and Confusion Matrices

Performance across all test datasets (a through l) in the Common Test Sample (More et al. 2024):

![ROC Curve and Confusion Matrix - Dataset A](roc_confusion_matrix/ViT_roc_confusion_matrix_a.png)
![ROC Curve and Confusion Matrix - Dataset B](roc_confusion_matrix/ViT_roc_confusion_matrix_b.png)
![ROC Curve and Confusion Matrix - Dataset C](roc_confusion_matrix/ViT_roc_confusion_matrix_c.png)
![ROC Curve and Confusion Matrix - Dataset D](roc_confusion_matrix/ViT_roc_confusion_matrix_d.png)
![ROC Curve and Confusion Matrix - Dataset E](roc_confusion_matrix/ViT_roc_confusion_matrix_e.png)
![ROC Curve and Confusion Matrix - Dataset F](roc_confusion_matrix/ViT_roc_confusion_matrix_f.png)
![ROC Curve and Confusion Matrix - Dataset G](roc_confusion_matrix/ViT_roc_confusion_matrix_g.png)
![ROC Curve and Confusion Matrix - Dataset H](roc_confusion_matrix/ViT_roc_confusion_matrix_h.png)
![ROC Curve and Confusion Matrix - Dataset I](roc_confusion_matrix/ViT_roc_confusion_matrix_i.png)
![ROC Curve and Confusion Matrix - Dataset J](roc_confusion_matrix/ViT_roc_confusion_matrix_j.png)
![ROC Curve and Confusion Matrix - Dataset K](roc_confusion_matrix/ViT_roc_confusion_matrix_k.png)
![ROC Curve and Confusion Matrix - Dataset L](roc_confusion_matrix/ViT_roc_confusion_matrix_l.png)

### 📋 Performance Summary

Average performance across 12 test datasets from the Common Test Sample (More et al. 2024):

| Metric | Value |
|-----------|----------|
| 🎯 Average Accuracy | 0.8028 |
| 📈 Average AUC-ROC | 0.8562 |
| ⚖️ Average F1-Score | 0.5517 |

## 📘 Citation

If you use this model in your research, please cite:

```bibtex
@misc{parlange2025gravit,
      title={GraViT: Transfer Learning with Vision Transformers and MLP-Mixer for Strong Gravitational Lens Discovery},
      author={René Parlange and Juan C. Cuevas-Tello and Octavio Valenzuela and Omar de J. Cabrera-Rosas and Tomás Verdugo and Anupreeta More and Anton T. Jaelani},
      year={2025},
      eprint={2509.00226},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2509.00226},
}
```

---

## Model Card Contact

For questions about this model, please contact the author through: https://github.com/parlange/
config.json
ADDED
@@ -0,0 +1,76 @@
{
    "architecture": "vit_base_patch16_224",
    "num_classes": 2,
    "num_features": 768,
    "global_pool": "token",
    "crop_pct": 0.875,
    "interpolation": "bicubic",
    "mean": [
        0.485,
        0.456,
        0.406
    ],
    "std": [
        0.229,
        0.224,
        0.225
    ],
    "first_conv": "patch_embed.proj",
    "classifier": "head",
    "input_size": [
        3,
        224,
        224
    ],
    "pool_size": [
        7,
        7
    ],
    "pretrained_cfg": {
        "tag": "gravit_c1",
        "custom_load": false,
        "input_size": [
            3,
            224,
            224
        ],
        "fixed_input_size": true,
        "interpolation": "bicubic",
        "crop_pct": 0.875,
        "crop_mode": "center",
        "mean": [
            0.485,
            0.456,
            0.406
        ],
        "std": [
            0.229,
            0.224,
            0.225
        ],
        "num_classes": 2,
        "pool_size": [
            7,
            7
        ],
        "first_conv": "patch_embed.proj",
        "classifier": "head"
    },
    "model_name": "vit_gravit_c1",
    "experiment": "c1",
    "training_strategy": "classification-head",
    "dataset": "C21+J24",
    "hyperparameters": {
        "batch_size": "192",
        "learning_rate": "AdamW with ReduceLROnPlateau",
        "epochs": "100",
        "patience": "10",
        "optimizer": "AdamW",
        "scheduler": "ReduceLROnPlateau",
        "image_size": "224x224",
        "fine_tune_mode": "classification_head",
        "stochastic_depth_probability": "0.1"
    },
    "hf_hub_id": "parlange/vit-gravit-c1",
    "license": "apache-2.0"
}
confusion_matrices/ViT_Confusion_Matrix_a.png
ADDED
confusion_matrices/ViT_Confusion_Matrix_b.png
ADDED
confusion_matrices/ViT_Confusion_Matrix_c.png
ADDED
confusion_matrices/ViT_Confusion_Matrix_d.png
ADDED
confusion_matrices/ViT_Confusion_Matrix_e.png
ADDED
confusion_matrices/ViT_Confusion_Matrix_f.png
ADDED
confusion_matrices/ViT_Confusion_Matrix_g.png
ADDED
confusion_matrices/ViT_Confusion_Matrix_h.png
ADDED
confusion_matrices/ViT_Confusion_Matrix_i.png
ADDED
confusion_matrices/ViT_Confusion_Matrix_j.png
ADDED
confusion_matrices/ViT_Confusion_Matrix_k.png
ADDED
confusion_matrices/ViT_Confusion_Matrix_l.png
ADDED
evaluation_results.csv
ADDED
@@ -0,0 +1,133 @@
Model,Dataset,Loss,Accuracy,AUCROC,F1
ViT,a,0.4575504291374072,0.7920353982300885,0.8256229797574562,0.5688073394495413
ViT,b,0.35027294130266856,0.854133920150896,0.8602099447513812,0.34831460674157305
ViT,c,0.5165887196384036,0.7544797233574347,0.786182320441989,0.24101068999028183
ViT,d,0.2510994746836054,0.9016032694121345,0.9034125230202578,0.44206773618538325
ViT,e,0.4143968988721117,0.8177826564215148,0.8443351244986,0.5990338164251208
ViT,f,0.3670418868224962,0.841640022569118,0.8478622806743998,0.12836438923395446
ViT,g,0.2497027156352997,0.9033333333333333,0.9734338888888888,0.9069618222649984
ViT,h,0.3378777855237325,0.8505,0.9529537222222222,0.8630743397954511
ViT,i,0.19712424822648367,0.9285,0.9841957777777779,0.9294755877034359
ViT,j,0.7854293506145478,0.5983333333333334,0.6951524444444445,0.4527702089009991
ViT,k,0.7328508781194687,0.6235,0.7771344444444446,0.46884552080884084
ViT,l,0.487411656158395,0.7679451725381748,0.8238313924596774,0.6716570261993875
MLP-Mixer,a,0.3996943971224591,0.8086283185840708,0.8391867831243361,0.5685785536159601
MLP-Mixer,b,0.3412993300058526,0.8704809808236403,0.8364769797421732,0.35625
MLP-Mixer,c,0.3957880755634062,0.8242690977679975,0.80653591160221,0.289707750952986
MLP-Mixer,d,0.30740130081636957,0.8868280414963848,0.8505101289134438,0.3877551020408163
MLP-Mixer,e,0.4771155259938978,0.7793633369923162,0.7879361235147202,0.5314685314685315
MLP-Mixer,f,0.3381275081289247,0.8648674064321986,0.8287089328554598,0.13693693693693693
MLP-Mixer,g,0.26878669277826944,0.9036666666666666,0.9679408888888887,0.9054319371727748
MLP-Mixer,h,0.29767482439676923,0.8791666666666667,0.9592756666666666,0.8841667998082761
MLP-Mixer,i,0.2508150982062022,0.9123333333333333,0.9736637777777777,0.9132013201320132
MLP-Mixer,j,0.6541107538541158,0.6641666666666667,0.7743557777777779,0.5689839572192513
MLP-Mixer,k,0.6361391656398773,0.6728333333333333,0.7940376666666666,0.5753839498161367
MLP-Mixer,l,0.4342376798394926,0.7992064446314777,0.8634144024748618,0.716034687978235
CvT,a,0.6301151535152334,0.6548672566371682,0.7049165921612678,0.41353383458646614
CvT,b,0.6147656698190952,0.6692863879283244,0.7044972375690608,0.17295597484276728
CvT,c,0.6944268911336811,0.6001257466205596,0.6591786372007367,0.14745308310991956
CvT,d,0.2650476947487693,0.9000314366551398,0.8999226519337017,0.40892193308550184
CvT,e,0.6509704625828991,0.6465422612513722,0.6885869976538257,0.4059040590405904
CvT,f,0.5340885097132559,0.7186383298852737,0.7464956265694429,0.0684931506849315
CvT,g,0.42030701192220055,0.8028333333333333,0.9186497222222223,0.825490485322319
CvT,h,0.4625407474835714,0.7661666666666667,0.8991485555555555,0.799542791827404
CvT,i,0.2348982002735138,0.9251666666666667,0.9787946666666666,0.9257237386269644
CvT,j,1.2669325167338052,0.438,0.3431898888888889,0.265359477124183
CvT,k,1.08152370317777,0.5603333333333333,0.6091516666666666,0.3158713692946058
CvT,l,0.7296624132198569,0.6642419141517374,0.6727565192585612,0.5574134242016008
Swin,a,0.3518127097492724,0.8573008849557522,0.92351161137984,0.7074829931972789
Swin,b,0.3482047942909419,0.8509902546369066,0.9245285451197054,0.3969465648854962
Swin,c,0.3979874673372393,0.8255265639735933,0.9125395948434623,0.35986159169550175
Swin,d,0.17529730912425615,0.9415278214397989,0.9671712707182321,0.6265060240963856
Swin,e,0.5681350649908005,0.7387486278814489,0.8579505032922122,0.5672727272727273
Swin,f,0.329795688788694,0.8606356968215159,0.9286062369152528,0.17391304347826086
Swin,g,0.2527379404703776,0.9035,0.9724114444444445,0.9083715777812945
Swin,h,0.2791310551166534,0.89,0.9682955555555556,0.896875
Swin,i,0.16106815997759502,0.9515,0.9900552222222222,0.9517492953075776
Swin,j,0.907479268391927,0.5803333333333334,0.6390204444444445,0.42511415525114155
Swin,k,0.8158094821770986,0.6283333333333333,0.7901601111111112,0.4550342130987292
Swin,l,0.503966703916006,0.7787062642779848,0.8190682128691764,0.682535575679172
CaiT,a,0.42890300766556666,0.8130530973451328,0.8644880523906682,0.6060606060606061
CaiT,b,0.3192856568973119,0.8893429739075762,0.903804788213628,0.42483660130718953
CaiT,c,0.48953707897974164,0.7730273498899717,0.8254953959484346,0.26476578411405294
CaiT,d,0.31145547537899587,0.867966048412449,0.9025340699815839,0.38235294117647056
CaiT,e,0.4548946529397849,0.7870472008781558,0.8492999318852645,0.5726872246696035
CaiT,f,0.3738367850840574,0.8445552003009216,0.8744395460236903,0.1359121798222687
CaiT,g,0.3011642297108968,0.8948333333333334,0.9556216666666666,0.8943225590353374
CaiT,h,0.39142585277557373,0.8331666666666667,0.9202581666666666,0.8421384639646743
CaiT,i,0.2970129141012828,0.8835,0.9525210000000002,0.8842523596621957
CaiT,j,0.7035526345570882,0.661,0.7364068888888889,0.5547285464098074
CaiT,k,0.6994013291200002,0.6496666666666666,0.7619980555555556,0.5465918895599655
CaiT,l,0.4910164130440666,0.7766021401947818,0.8205470707913558,0.6864135021097046
DeiT,a,0.4880231645254962,0.7798672566371682,0.8260661913604304,0.5664488017429193
DeiT,b,0.31241170634304494,0.8758252121974222,0.8908674033149171,0.3969465648854962
DeiT,c,0.5076334938917547,0.7563659226658284,0.8106298342541435,0.25120772946859904
DeiT,d,0.21307693347358284,0.9264382269726501,0.9316758747697976,0.5263157894736842
DeiT,e,0.41276343642814756,0.8155872667398463,0.8614016498902596,0.6074766355140186
DeiT,f,0.347494895568754,0.852642467556893,0.8730114223466999,0.1423097974822113
DeiT,g,0.21456254084904988,0.9195,0.9812000000000001,0.9221595487510073
DeiT,h,0.31806263717015587,0.8561666666666666,0.9641542222222222,0.8689445709946849
DeiT,i,0.1618985648949941,0.9463333333333334,0.9902026666666668,0.9467240238252813
DeiT,j,0.7059943490028381,0.6493333333333333,0.7624447777777776,0.5410122164048866
DeiT,k,0.6533303724129995,0.6761666666666667,0.8324687777777777,0.5607054035722361
DeiT,l,0.4469874652336176,0.7916315979319466,0.8543558192306258,0.7094232059020792
DeiT3,a,0.4202683992617953,0.8163716814159292,0.8467825130097888,0.5829145728643216
DeiT3,b,0.262984538234768,0.9012889028607356,0.8990220994475138,0.4249084249084249
DeiT3,c,0.472756382524124,0.7783715812637535,0.7983001841620626,0.24759871931696906
DeiT3,d,0.33147037092037673,0.862936183590066,0.8655248618784529,0.3473053892215569
DeiT3,e,0.38950261964363536,0.814489571899012,0.86056913645652,0.5785536159600998
DeiT3,f,0.34437218432882033,0.8561218732367877,0.8542026846822373,0.13166855845629966
DeiT3,g,0.21329958017667133,0.9278333333333333,0.9798244444444444,0.9286067600989283
DeiT3,h,0.32451361187299094,0.8626666666666667,0.9527946111111111,0.872366790582404
DeiT3,i,0.24960849436124166,0.9075,0.9715454444444445,0.9102957814772911
DeiT3,j,0.5672995191415151,0.7125,0.8462789444444444,0.6385920804525456
DeiT3,k,0.6036084276040395,0.6921666666666667,0.800755,0.62267620020429
DeiT3,l,0.4151913604799968,0.8082241192737766,0.8763639318146963,0.7364072054205917
Twins_SVT,a,0.4067131507713183,0.8396017699115044,0.8804971611532671,0.6472019464720195
Twins_SVT,b,0.3369882810569267,0.8682804149638479,0.8973793738489871,0.3883211678832117
Twins_SVT,c,0.44650013615072465,0.808236403646652,0.8531749539594844,0.3036529680365297
Twins_SVT,d,0.11182147282515634,0.9789374410562716,0.9822780847145488,0.7987987987987988
Twins_SVT,e,0.48889580673234523,0.7694840834248079,0.8373949897827897,0.5588235294117647
Twins_SVT,f,0.30908069689963935,0.8816061688922324,0.903702339279268,0.17442622950819672
Twins_SVT,g,0.25526987624168396,0.9126666666666666,0.9751330000000001,0.9157285300739788
Twins_SVT,h,0.3133294117450714,0.8808333333333334,0.9663294444444444,0.8884381338742393
Twins_SVT,i,0.13589392483234405,0.9713333333333334,0.9974112222222222,0.9706784861916127
Twins_SVT,j,1.2727711579799652,0.49183333333333334,0.44312866666666667,0.1743839696723531
Twins_SVT,k,1.1533952040274937,0.5505,0.7285452777777779,0.19275665968272973
Twins_SVT,l,0.6311305313460368,0.7541180714199832,0.7295359094064597,0.6175425472227417
Twins_PCPVT,a,0.47225049058947943,0.7533185840707964,0.8392020662830595,0.5363825363825364
Twins_PCPVT,b,0.3277859269766941,0.8707953473750393,0.9017642725598527,0.38565022421524664
Twins_PCPVT,c,0.5325269242833524,0.7318453316567117,0.8134834254143647,0.23222322232223222
Twins_PCPVT,d,0.34038559299375754,0.8726815466834329,0.8983720073664826,0.3891402714932127
Twins_PCPVT,e,0.43271679039451605,0.8024149286498353,0.8658064027851359,0.589041095890411
Twins_PCPVT,f,0.40193462960922294,0.8247131841263871,0.8686157929759784,0.12158341187558906
Twins_PCPVT,g,0.28006591550509136,0.9015,0.9626466666666665,0.9035417006691693
Twins_PCPVT,h,0.3886127746899923,0.8278333333333333,0.9278962222222221,0.8427462323032425
Twins_PCPVT,i,0.28674582862854003,0.9025,0.9641564444444444,0.9044273811468714
Twins_PCPVT,j,0.5901691876252493,0.6993333333333334,0.8068135555555556,0.6328856328856329
Twins_PCPVT,k,0.5968491020202636,0.7003333333333334,0.7921438888888889,0.6336593317033414
Twins_PCPVT,l,0.4564983546913021,0.7871227606107971,0.8525991250300863,0.7154680594616312
PiT,a,0.3924222976233052,0.831858407079646,0.8797864942726362,0.6513761467889908
PiT,b,0.31899622544594736,0.8758252121974222,0.9015782688766113,0.4182621502209131
PiT,c,0.454398755605998,0.7959761081420936,0.8498250460405157,0.30439442658092175
PiT,d,0.14505015389513048,0.9597610814209369,0.9646924493554327,0.6893203883495146
PiT,e,0.3891645081087734,0.827661909989023,0.8838870809051691,0.6439909297052154
PiT,f,0.3070308875887381,0.8753996614632311,0.9020960965500401,0.17650714729645742
PiT,g,0.22340119377772014,0.9178333333333333,0.9791551111111111,0.9207268049525648
PiT,h,0.2951871022383372,0.8755,0.9678674444444444,0.8845975590916113
PiT,i,0.13118078410625458,0.9623333333333334,0.9941626666666668,0.9620295698924731
PiT,j,1.1235656504631042,0.5436666666666666,0.5612956111111111,0.3110216406643181
PiT,k,1.0313452566862107,0.5881666666666666,0.7460636666666667,0.33342325330455896
PiT,l,0.5711348351591465,0.7689070578333533,0.7784629383878745,0.6533814247069432
Ensemble,a,,0.8628318584070797,0.9054087098721564,0.6836734693877551
Ensemble,b,,0.9182646966362779,0.931486187845304,0.5075757575757576
Ensemble,c,,0.8340144608613643,0.8768360957642726,0.33668341708542715
Ensemble,d,,0.9682489783087079,0.9682486187845304,0.7262872628726287
Ensemble,e,,0.8309549945115258,0.8953757662907742,0.6350710900473934
Ensemble,f,,0.9079368064698138,0.922026931389281,0.214915797914996
Ensemble,g,,0.9488333333333333,0.9909529999999999,0.9498284033338781
Ensemble,h,,0.9041666666666667,0.9819358888888887,0.9099733834351026
Ensemble,i,,0.9753333333333334,0.9972660000000001,0.9751677852348993
Ensemble,j,,0.5953333333333334,0.7440818888888889,0.39269634817408705
Ensemble,k,,0.6218333333333333,0.8487693333333335,0.408960666840323
Ensemble,l,,0.802332571840808,0.8579947764427092,0.6993965990126165
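The file above holds per-dataset results for every GraViT backbone and the ensemble on test sets a-l. A small pandas sketch (pandas assumed available, run from the repository root) to recover the per-model averages quoted in the README; the ViT row should come out at roughly 0.8028 / 0.8562 / 0.5517:

```python
import pandas as pd

df = pd.read_csv("evaluation_results.csv")
# Mean Accuracy, AUC-ROC and F1 over the 12 test sets (a-l) for each model
summary = df.groupby("Model")[["Accuracy", "AUCROC", "F1"]].mean().round(4)
print(summary.loc["ViT"])
```
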
model.safetensors
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b3ef5c4fd58e320b23726b5612ec2180d0eb81eba5a3265b7c74ac67d1318ada
size 343214864
pytorch_model.bin
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c2e0c4ab92be37dec3f4b1f575a69daa65b8ec3b8778968adea53a5692e2c759
size 343259038
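Both weight files are stored as Git LFS pointers. Besides the hub loading shown in the README, a locally cloned copy of model.safetensors could be loaded into a freshly built timm model roughly as follows; this is a sketch that assumes the checkpoint keys follow the standard vit_base_patch16_224 layout indicated by config.json:

```python
import timm
from safetensors.torch import load_file

# Build the same architecture as described in config.json, then load local weights
model = timm.create_model("vit_base_patch16_224", pretrained=False, num_classes=2)
state_dict = load_file("model.safetensors")
model.load_state_dict(state_dict)
model.eval()
```
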
roc_confusion_matrix/ViT_roc_confusion_matrix_a.png
ADDED
roc_confusion_matrix/ViT_roc_confusion_matrix_b.png
ADDED
roc_confusion_matrix/ViT_roc_confusion_matrix_c.png
ADDED
roc_confusion_matrix/ViT_roc_confusion_matrix_d.png
ADDED
roc_confusion_matrix/ViT_roc_confusion_matrix_e.png
ADDED
roc_confusion_matrix/ViT_roc_confusion_matrix_f.png
ADDED
roc_confusion_matrix/ViT_roc_confusion_matrix_g.png
ADDED
roc_confusion_matrix/ViT_roc_confusion_matrix_h.png
ADDED
roc_confusion_matrix/ViT_roc_confusion_matrix_i.png
ADDED
roc_confusion_matrix/ViT_roc_confusion_matrix_j.png
ADDED
roc_confusion_matrix/ViT_roc_confusion_matrix_k.png
ADDED
roc_confusion_matrix/ViT_roc_confusion_matrix_l.png
ADDED
roc_curves/ViT_ROC_a.png
ADDED
roc_curves/ViT_ROC_b.png
ADDED
roc_curves/ViT_ROC_c.png
ADDED
roc_curves/ViT_ROC_d.png
ADDED
roc_curves/ViT_ROC_e.png
ADDED
roc_curves/ViT_ROC_f.png
ADDED
roc_curves/ViT_ROC_g.png
ADDED
roc_curves/ViT_ROC_h.png
ADDED
roc_curves/ViT_ROC_i.png
ADDED
roc_curves/ViT_ROC_j.png
ADDED
roc_curves/ViT_ROC_k.png
ADDED
roc_curves/ViT_ROC_l.png
ADDED
training_curves/ViT_accuracy.png
ADDED
training_curves/ViT_auc.png
ADDED
training_curves/ViT_combined_metrics.png
ADDED
training_curves/ViT_f1.png
ADDED
training_curves/ViT_loss.png
ADDED
training_curves/ViT_metrics.csv
ADDED
@@ -0,0 +1,36 @@
epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
1,0.5323386875501054,0.4755117762888138,0.7353011594896878,0.7711370262390671,0.81348677488557,0.885203019150184,0.7361674564483688,0.7931488801054019
2,0.43781517835478373,0.4284092179192746,0.8014929712350788,0.8192419825072886,0.8830454485545415,0.9047898834669228,0.8016473141431488,0.8299039780521262
3,0.41427445929300877,0.4113463647337766,0.815388035708178,0.8221574344023324,0.8951539429236054,0.9151671497420293,0.8154827020374675,0.8349120433017592
4,0.4023083928608962,0.40359135593339235,0.8228101378390396,0.8192419825072886,0.9012251468093784,0.9179381040212836,0.8228828336011488,0.8322056833558863
5,0.3954562406641524,0.38841467365926624,0.8263587235352464,0.8301749271137027,0.9046182985878689,0.9236914465911312,0.8263661470847263,0.8409556313993174
6,0.3872764221428153,0.3705469915540156,0.8305058658549098,0.8542274052478134,0.9087822170114311,0.9251704221880339,0.8304159608507434,0.8585572842998586
7,0.384070718126755,0.39311501870349963,0.8321219687382426,0.8163265306122449,0.9101693789104086,0.9269139559197275,0.8320286098063876,0.8328912466843501
8,0.3805712183671957,0.35730582684191603,0.8340116975065841,0.8651603498542274,0.9117416777986673,0.9309142024156603,0.8339662663792808,0.8687012065294535
9,0.3776779572743212,0.357522688349899,0.8343024250094059,0.8556851311953353,0.9130646675394666,0.932449489583422,0.8342825867583423,0.861731843575419
10,0.374850532579802,0.3752483752482834,0.8371327427574649,0.8345481049562682,0.9144322412803798,0.9314040068338874,0.8371090643199836,0.8469318948078219
11,0.3697774252474999,0.34939942624061515,0.8396039265314499,0.8680758017492711,0.9168827815351508,0.9343598330627545,0.8393401623899415,0.8722653493295696
12,0.3697715133579435,0.3443179043666962,0.8393217498375346,0.8658892128279884,0.9166875088673295,0.936127803891236,0.8393066351967299,0.8702397743300423
13,0.36852555068119747,0.3601747573551562,0.8397150870472346,0.8513119533527697,0.9173666724458207,0.9336883441423217,0.8394831262470136,0.859504132231405
14,0.3676681494243527,0.35326992744259517,0.8404504566131956,0.8564139941690962,0.9176733040671543,0.9359365570468088,0.8402742704525805,0.8632893823733518
15,0.3650746907724517,0.3343106554739677,0.8426907685467045,0.8731778425655977,0.9190811037822717,0.9378840874125578,0.8424954838488737,0.8760683760683761
16,0.3633183905273226,0.36100645167834555,0.8436655607620481,0.8462099125364432,0.9197041046581621,0.9356390619554776,0.8434554032416881,0.8563648740639891
17,0.3629737127773999,0.3471138860498156,0.8429643944317132,0.8637026239067055,0.9197510415418706,0.936129928856174,0.8429899031350723,0.8693221523410203
18,0.363794944664887,0.35338963598621137,0.8428874371515546,0.8527696793002916,0.919400791056018,0.937946773878231,0.8427179812021708,0.8620218579234973
19,0.3612006870932197,0.3544301845937012,0.8441957109142525,0.8483965014577259,0.9205795272438061,0.9371031627978137,0.8440797186400938,0.8583106267029973
20,0.35985265280425144,0.34653563688864863,0.8451876047474091,0.8629737609329446,0.9212270520689332,0.9376503412693691,0.8448732338854092,0.869625520110957
21,0.3582611175785762,0.3465235511346044,0.8451705031295961,0.8564139941690962,0.9218992112489598,0.9386681994747087,0.8448534388950295,0.8646048109965636
22,0.35836609831110994,0.3376696258348904,0.8452816636453808,0.8658892128279884,0.9218332174704624,0.9389146954075258,0.845069698941672,0.8711484593837535
23,0.357339754399447,0.3394383715123546,0.8458716694599309,0.8658892128279884,0.9223462758024424,0.9387999473008696,0.8455904398852101,0.8713286713286713
24,0.35804432434826455,0.344140861628702,0.8449738345247461,0.8637026239067055,0.9219979159587275,0.9382708310312879,0.8446922971491229,0.8702290076335878
25,0.3563122466666781,0.34014157038562154,0.8465728357902658,0.8658892128279884,0.922973728878079,0.9390262560667749,0.846171651963684,0.8715083798882681
26,0.35848609832751505,0.3428510632094419,0.8452474604097547,0.8622448979591837,0.9218336656735835,0.9383760167957229,0.8450699402468882,0.8688410825815406
27,0.3563769290565641,0.3381501745552085,0.8467353011594897,0.8673469387755102,0.9227351969318646,0.9387680728267983,0.8464569627192983,0.8725490196078431
28,0.35655961564249594,0.3393103123232505,0.845905872695557,0.8658892128279884,0.9226312293084508,0.9387829475813649,0.8456749419814512,0.8713286713286713
29,0.3574257842763429,0.33912095592598873,0.8468037076307419,0.8666180758017493,0.9223738257442348,0.9387500106248245,0.8466122155442544,0.8719384184744576
30,0.3562603021361586,0.3389132391259552,0.8461880493894722,0.8666180758017493,0.9229400096725773,0.9387276984929749,0.8459905135361907,0.8719384184744576
31,0.35629812248557524,0.34000008781345525,0.8464360228477614,0.8658892128279884,0.9229673904231751,0.9387000739487797,0.8461400066824876,0.8715083798882681
32,0.3569562033858007,0.3386197740934333,0.8463163115230701,0.8666180758017493,0.9226200321269892,0.9387914474411172,0.8462913391887384,0.8719384184744576
33,0.35738727149768645,0.3403620165569094,0.8466412422615179,0.8658892128279884,0.9223608182023694,0.9387330109053201,0.8463166553842726,0.8715083798882681
34,0.3566963535374173,0.3399086983836427,0.8461452953449397,0.8666180758017493,0.9228278941112513,0.9387478856598866,0.8463148184528131,0.8721174004192872
35,0.3577859981623435,0.3396764656545122,0.8452560112186613,0.8666180758017493,0.9222524587733002,0.9387659478618602,0.8453181759904269,0.8721174004192872
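The combined-metrics figure referenced in the README can be approximated directly from this CSV. A minimal matplotlib sketch (pandas and matplotlib assumed available, run from the repository root):

```python
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("training_curves/ViT_metrics.csv")

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
# Loss curves
axes[0].plot(df["epoch"], df["train_loss"], label="train")
axes[0].plot(df["epoch"], df["val_loss"], label="validation")
axes[0].set_xlabel("epoch"); axes[0].set_ylabel("loss"); axes[0].legend()
# Accuracy curves
axes[1].plot(df["epoch"], df["train_accuracy"], label="train")
axes[1].plot(df["epoch"], df["val_accuracy"], label="validation")
axes[1].set_xlabel("epoch"); axes[1].set_ylabel("accuracy"); axes[1].legend()

plt.tight_layout()
plt.show()
```
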
training_metrics.csv
ADDED
@@ -0,0 +1,36 @@
epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
1,0.5323386875501054,0.4755117762888138,0.7353011594896878,0.7711370262390671,0.81348677488557,0.885203019150184,0.7361674564483688,0.7931488801054019
2,0.43781517835478373,0.4284092179192746,0.8014929712350788,0.8192419825072886,0.8830454485545415,0.9047898834669228,0.8016473141431488,0.8299039780521262
3,0.41427445929300877,0.4113463647337766,0.815388035708178,0.8221574344023324,0.8951539429236054,0.9151671497420293,0.8154827020374675,0.8349120433017592
4,0.4023083928608962,0.40359135593339235,0.8228101378390396,0.8192419825072886,0.9012251468093784,0.9179381040212836,0.8228828336011488,0.8322056833558863
5,0.3954562406641524,0.38841467365926624,0.8263587235352464,0.8301749271137027,0.9046182985878689,0.9236914465911312,0.8263661470847263,0.8409556313993174
6,0.3872764221428153,0.3705469915540156,0.8305058658549098,0.8542274052478134,0.9087822170114311,0.9251704221880339,0.8304159608507434,0.8585572842998586
7,0.384070718126755,0.39311501870349963,0.8321219687382426,0.8163265306122449,0.9101693789104086,0.9269139559197275,0.8320286098063876,0.8328912466843501
8,0.3805712183671957,0.35730582684191603,0.8340116975065841,0.8651603498542274,0.9117416777986673,0.9309142024156603,0.8339662663792808,0.8687012065294535
9,0.3776779572743212,0.357522688349899,0.8343024250094059,0.8556851311953353,0.9130646675394666,0.932449489583422,0.8342825867583423,0.861731843575419
10,0.374850532579802,0.3752483752482834,0.8371327427574649,0.8345481049562682,0.9144322412803798,0.9314040068338874,0.8371090643199836,0.8469318948078219
11,0.3697774252474999,0.34939942624061515,0.8396039265314499,0.8680758017492711,0.9168827815351508,0.9343598330627545,0.8393401623899415,0.8722653493295696
12,0.3697715133579435,0.3443179043666962,0.8393217498375346,0.8658892128279884,0.9166875088673295,0.936127803891236,0.8393066351967299,0.8702397743300423
13,0.36852555068119747,0.3601747573551562,0.8397150870472346,0.8513119533527697,0.9173666724458207,0.9336883441423217,0.8394831262470136,0.859504132231405
14,0.3676681494243527,0.35326992744259517,0.8404504566131956,0.8564139941690962,0.9176733040671543,0.9359365570468088,0.8402742704525805,0.8632893823733518
15,0.3650746907724517,0.3343106554739677,0.8426907685467045,0.8731778425655977,0.9190811037822717,0.9378840874125578,0.8424954838488737,0.8760683760683761
16,0.3633183905273226,0.36100645167834555,0.8436655607620481,0.8462099125364432,0.9197041046581621,0.9356390619554776,0.8434554032416881,0.8563648740639891
17,0.3629737127773999,0.3471138860498156,0.8429643944317132,0.8637026239067055,0.9197510415418706,0.936129928856174,0.8429899031350723,0.8693221523410203
18,0.363794944664887,0.35338963598621137,0.8428874371515546,0.8527696793002916,0.919400791056018,0.937946773878231,0.8427179812021708,0.8620218579234973
19,0.3612006870932197,0.3544301845937012,0.8441957109142525,0.8483965014577259,0.9205795272438061,0.9371031627978137,0.8440797186400938,0.8583106267029973
20,0.35985265280425144,0.34653563688864863,0.8451876047474091,0.8629737609329446,0.9212270520689332,0.9376503412693691,0.8448732338854092,0.869625520110957
21,0.3582611175785762,0.3465235511346044,0.8451705031295961,0.8564139941690962,0.9218992112489598,0.9386681994747087,0.8448534388950295,0.8646048109965636
22,0.35836609831110994,0.3376696258348904,0.8452816636453808,0.8658892128279884,0.9218332174704624,0.9389146954075258,0.845069698941672,0.8711484593837535
23,0.357339754399447,0.3394383715123546,0.8458716694599309,0.8658892128279884,0.9223462758024424,0.9387999473008696,0.8455904398852101,0.8713286713286713
24,0.35804432434826455,0.344140861628702,0.8449738345247461,0.8637026239067055,0.9219979159587275,0.9382708310312879,0.8446922971491229,0.8702290076335878
25,0.3563122466666781,0.34014157038562154,0.8465728357902658,0.8658892128279884,0.922973728878079,0.9390262560667749,0.846171651963684,0.8715083798882681
26,0.35848609832751505,0.3428510632094419,0.8452474604097547,0.8622448979591837,0.9218336656735835,0.9383760167957229,0.8450699402468882,0.8688410825815406
27,0.3563769290565641,0.3381501745552085,0.8467353011594897,0.8673469387755102,0.9227351969318646,0.9387680728267983,0.8464569627192983,0.8725490196078431
28,0.35655961564249594,0.3393103123232505,0.845905872695557,0.8658892128279884,0.9226312293084508,0.9387829475813649,0.8456749419814512,0.8713286713286713
29,0.3574257842763429,0.33912095592598873,0.8468037076307419,0.8666180758017493,0.9223738257442348,0.9387500106248245,0.8466122155442544,0.8719384184744576
30,0.3562603021361586,0.3389132391259552,0.8461880493894722,0.8666180758017493,0.9229400096725773,0.9387276984929749,0.8459905135361907,0.8719384184744576
31,0.35629812248557524,0.34000008781345525,0.8464360228477614,0.8658892128279884,0.9229673904231751,0.9387000739487797,0.8461400066824876,0.8715083798882681
32,0.3569562033858007,0.3386197740934333,0.8463163115230701,0.8666180758017493,0.9226200321269892,0.9387914474411172,0.8462913391887384,0.8719384184744576
33,0.35738727149768645,0.3403620165569094,0.8466412422615179,0.8658892128279884,0.9223608182023694,0.9387330109053201,0.8463166553842726,0.8715083798882681
34,0.3566963535374173,0.3399086983836427,0.8461452953449397,0.8666180758017493,0.9228278941112513,0.9387478856598866,0.8463148184528131,0.8721174004192872
35,0.3577859981623435,0.3396764656545122,0.8452560112186613,0.8666180758017493,0.9222524587733002,0.9387659478618602,0.8453181759904269,0.8721174004192872
training_notebook_c1.ipynb
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f61fa8d16483db850ff07e750ecef613320fea004eca79681f0f0da3854f1df8
size 26069037