How to use mahwizzzz/UrduIntentClassification with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="mahwizzzz/UrduIntentClassification")
```

Or load the model and tokenizer directly:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("mahwizzzz/UrduIntentClassification")
model = AutoModelForSequenceClassification.from_pretrained("mahwizzzz/UrduIntentClassification")
```

This model is a fine-tuned version of urduhack/roberta-urdu-small on the MASSIVE dataset (mteb/amazon_massive_intent). It achieves the following results on the evaluation set:
More information needed
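With the tokenizer and model loaded as above, a prediction is a forward pass followed by a softmax and an argmax over the logits. The sketch below shows only that post-processing step, using stand-in logits and a hypothetical `id2label` mapping; the model's real mapping ships in its `config.json` and is available as `model.config.id2label`.

```python
import math

# Stand-in logits; in practice these come from
# model(**tokenizer(text, return_tensors="pt")).logits[0].tolist()
logits = [0.2, 2.9, -1.1]

# Hypothetical id2label mapping -- use model.config.id2label for the real one
id2label = {0: "alarm_set", 1: "weather_query", 2: "play_music"}

# Softmax turns logits into probabilities
exps = [math.exp(x) for x in logits]
probs = [e / sum(exps) for e in exps]

# The predicted intent is the argmax
pred = max(range(len(probs)), key=probs.__getitem__)
print(id2label[pred], round(probs[pred], 3))  # → weather_query 0.921
```

This is exactly what the `text-classification` pipeline does internally, so `pipe("...")` returns the same top label and score.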
Training results:
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 1.3314 | 1.0 | 720 | 1.0549 |
| 0.7468 | 2.0 | 1440 | 0.7997 |
| 0.4117 | 3.0 | 2160 | 0.7482 |
| 0.3267 | 4.0 | 2880 | 0.8247 |
| 0.2292 | 5.0 | 3600 | 0.9014 |
| 0.0356 | 6.0 | 4320 | 0.9446 |
| 0.0123 | 7.0 | 5040 | 0.9757 |
| 0.0208 | 8.0 | 5760 | 0.9854 |
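In the table above, validation loss bottoms out at epoch 3 (0.7482) and then climbs while training loss keeps falling, a typical overfitting signature; best-checkpoint selection would therefore keep the epoch-3 weights. A minimal sketch of that selection over the reported history:

```python
# (epoch, training loss, validation loss) rows from the table above
history = [
    (1, 1.3314, 1.0549),
    (2, 0.7468, 0.7997),
    (3, 0.4117, 0.7482),
    (4, 0.3267, 0.8247),
    (5, 0.2292, 0.9014),
    (6, 0.0356, 0.9446),
    (7, 0.0123, 0.9757),
    (8, 0.0208, 0.9854),
]

# Best checkpoint = lowest validation loss
best_epoch, _, best_val = min(history, key=lambda row: row[2])
print(best_epoch, best_val)  # → 3 0.7482
```

With the `Trainer` API, `load_best_model_at_end=True` together with `metric_for_best_model="eval_loss"` automates this choice.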
Base model: urduhack/roberta-urdu-small