Upload README.md with huggingface_hub
    	
README.md CHANGED

````diff
@@ -10,17 +10,4 @@ If you haven't already, you can install the [Transformers.js](https://huggingfac
 npm i @huggingface/transformers
 ```
 
-```js
-import { pipeline } from '@huggingface/transformers';
-
-// Create the pipeline
-const transcriber = await pipeline('automatic-speech-recognition', 'Xenova/whisper-large-v3', {
-    dtype: 'fp32',  // Options: "fp32", "fp16", "q8", "q4"
-});
-
-// Use the model
-const url = 'https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/jfk.wav';
-const output = await transcriber(url);
-```
-
 Note: Having a separate repo for ONNX weights is intended to be a temporary solution until WebML gains more traction. If you would like to make your models web-ready, we recommend converting to ONNX using [🤗 Optimum](https://huggingface.co/docs/optimum/index) and structuring your repo like this one (with ONNX weights located in a subfolder named `onnx`).
````
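The note above only prescribes that ONNX weights live in an `onnx` subfolder next to the usual model files. As an illustrative sketch (file names here are hypothetical examples, not a listing of this repo), such a layout might look like:

```
my-model/
├── config.json
├── tokenizer.json
└── onnx/
    ├── model.onnx
    └── model_quantized.onnx
```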

