Instructions for using autonomous019/bert_small_uncased_512 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use autonomous019/bert_small_uncased_512 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="autonomous019/bert_small_uncased_512")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("autonomous019/bert_small_uncased_512")
model = AutoModelForSequenceClassification.from_pretrained("autonomous019/bert_small_uncased_512")
```
- Notebooks
- Google Colab
- Kaggle
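Under the hood, the text-classification pipeline tokenizes the input, runs a forward pass, applies softmax to the logits, and maps the highest-probability index to a label name. A minimal sketch of that post-processing step on dummy logits (the logit values and the `id2label` mapping here are placeholders for illustration, not this model's actual output or config):

```python
import math

# Dummy logits standing in for model(**inputs).logits from a real forward pass.
logits = [2.0, -1.0]

# Softmax: convert logits to probabilities that sum to 1.
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# Placeholder label mapping; a real pipeline reads this from the model config.
id2label = {0: "LABEL_0", 1: "LABEL_1"}
pred = id2label[probs.index(max(probs))]
print(pred, max(probs))
```

The pipeline returns the same information as a list of `{"label": ..., "score": ...}` dicts.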