---
library_name: transformers
tags:
- transformers.js
- tokenizers
---
# GPT-3.5-turbo Tokenizer
A 🤗-compatible version of the **GPT-3.5-turbo tokenizer** (adapted from [openai/tiktoken](https://github.com/openai/tiktoken)). This means it can be used with Hugging Face libraries including [Transformers](https://github.com/huggingface/transformers), [Tokenizers](https://github.com/huggingface/tokenizers), and [Transformers.js](https://github.com/huggingface/transformers.js).
## Usage (Transformers.js)
If you haven't already, you can install the [Transformers.js](https://huggingface.co/docs/transformers.js) JavaScript library from [NPM](https://www.npmjs.com/package/@huggingface/transformers) using:
```bash
npm i @huggingface/transformers
```
**Example:** Tokenize text using Transformers.js
```js
import { AutoTokenizer } from '@huggingface/transformers';

// Load the tokenizer from the Hugging Face Hub
const tokenizer = await AutoTokenizer.from_pretrained('Xenova/gpt-3.5-turbo');
const tokens = tokenizer.encode('hello world'); // [15339, 1917]
```
## Usage (Transformers/Tokenizers)
**Example:** Tokenize text using Transformers in Python
```py
from transformers import GPT2TokenizerFast

# Load the tokenizer from the Hugging Face Hub
tokenizer = GPT2TokenizerFast.from_pretrained('Xenova/gpt-3.5-turbo')
assert tokenizer.encode('hello world') == [15339, 1917]
```
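Since this tokenizer is adapted from [openai/tiktoken](https://github.com/openai/tiktoken), the IDs it produces should match the original tiktoken encoding, and they can be decoded back to the input text. A minimal round-trip/cross-check sketch, assuming the `tiktoken` package is installed (not part of the original card):
```py
import tiktoken
from transformers import GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained('Xenova/gpt-3.5-turbo')
ids = tokenizer.encode('hello world')  # [15339, 1917]

# Round-trip: decode the token IDs back to the original string
assert tokenizer.decode(ids) == 'hello world'

# Cross-check against the original tiktoken encoding for gpt-3.5-turbo
enc = tiktoken.encoding_for_model('gpt-3.5-turbo')
assert enc.encode('hello world') == ids
```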