Spaces: Running on CPU Upgrade
Commit · ded6a78
Parent(s): 1433ab1
Fixed description wording for clarity and typos. Also added a direct link to ONNX website in description.
app.py CHANGED

@@ -91,17 +91,17 @@ TITLE = """
 
 # for some reason https://huggingface.co/settings/tokens is not showing as a link by default?
 DESCRIPTION = """
-This Space allows to automatically convert
+This Space allows you to automatically convert 🤗 transformers PyTorch models hosted on the Hugging Face Hub to [ONNX](https://onnx.ai/). It opens a PR on the target model, and it is up to the owner of the original model
 to merge the PR to allow people to leverage the ONNX standard to share and use the model on a wide range of devices!
 
-Once converted, the model can for example be used in the [🤗 Optimum](https://huggingface.co/docs/optimum/) library following
+Once converted, the model can, for example, be used in the [🤗 Optimum](https://huggingface.co/docs/optimum/) library closely following the transformers API.
 Check out [this guide](https://huggingface.co/docs/optimum/main/en/onnxruntime/usage_guides/models) to see how!
 
-The steps are
+The steps are as following:
 - Paste a read-access token from [https://huggingface.co/settings/tokens](https://huggingface.co/settings/tokens). Read access is enough given that we will open a PR against the source repo.
 - Input a model id from the Hub (for example: [textattack/distilbert-base-cased-CoLA](https://huggingface.co/textattack/distilbert-base-cased-CoLA))
 - Click "Convert to ONNX"
-- That's it! You'll get feedback if
+- That's it! You'll get feedback on if the conversion was successful or not, and if it was, you'll get the URL of the opened PR!
 
 Note: in case the model to convert is larger than 2 GB, it will be saved in a subfolder called `onnx/`. To load it from Optimum, the argument `subfolder="onnx"` should be provided.
 """
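The note about the 2 GB threshold in the new description can be sketched as a small helper. This is a hypothetical illustration, not part of the Space's code: it only shows how the `subfolder="onnx"` argument would be selected when loading a converted model with 🤗 Optimum.

```python
def optimum_load_kwargs(model_size_gb: float) -> dict:
    """Extra kwargs for Optimum's from_pretrained: per the Space's
    description, converted models larger than 2 GB are saved under
    an `onnx/` subfolder, so they need subfolder="onnx" to load."""
    return {"subfolder": "onnx"} if model_size_gb > 2 else {}


# Hypothetical usage with 🤗 Optimum (requires `optimum[onnxruntime]`
# and a merged conversion PR; shown as a comment so the sketch stays
# self-contained):
#   from optimum.onnxruntime import ORTModelForSequenceClassification
#   model = ORTModelForSequenceClassification.from_pretrained(
#       "textattack/distilbert-base-cased-CoLA",
#       **optimum_load_kwargs(model_size_gb=0.25),
#   )

print(optimum_load_kwargs(3.0))   # {'subfolder': 'onnx'}
print(optimum_load_kwargs(0.25))  # {}
```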