Felix Marty committed · Commit 04e8a16 · Parent(s): c9b76af

rename to export
README.md CHANGED

@@ -1,6 +1,6 @@
 ---
-title:
-emoji:
+title: Export to ONNX
+emoji: 🏎️
 colorFrom: green
 colorTo: purple
 sdk: gradio
app.py CHANGED

@@ -59,7 +59,7 @@ def onnx_export(token: str, model_id: str, task: str, opset: Union[int, str]) ->
         commit_url = repo.push_to_hub()
         print("[dataset]", commit_url)

-        return f"#### Success 🔥 Yay! This model was successfully
+        return f"#### Success 🔥 Yay! This model was successfully exported and a PR was open using your token, here: [{commit_info.pr_url}]({commit_info.pr_url})"
     except Exception as e:
         return f"#### Error: {e}"
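The new success message interpolates `commit_info.pr_url`, i.e. the export is proposed to the original repository as a pull request rather than pushed directly. For context only (this is not the Space's own upload code), here is a minimal sketch of opening such a PR with `huggingface_hub`; the repo id, local folder path, and commit message are placeholder assumptions:

```python
# Illustrative sketch only: open a pull request on a Hub model repo and read
# back its URL, similar to what the success message above reports.
from huggingface_hub import HfApi

api = HfApi(token="hf_xxx")  # placeholder token; read access suffices when opening a PR against someone else's repo

# Uploading with create_pr=True turns the commit into a pull request; the
# returned CommitInfo then carries a pr_url instead of landing on main.
commit_info = api.upload_folder(
    repo_id="textattack/distilbert-base-cased-CoLA",  # example model id from the description below
    folder_path="exported_onnx/",                     # hypothetical local folder holding the ONNX export
    path_in_repo="onnx/",
    commit_message="Add ONNX export",
    create_pr=True,
)
print(commit_info.pr_url)
```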
@@ -89,26 +89,26 @@ TITLE = """
 "
 >
 <h1 style="font-weight: 900; margin-bottom: 10px; margin-top: 10px;">
-
+    Export transformers model to ONNX with 🤗 Optimum exporters 🏎️ (Beta)
 </h1>
 </div>
 """

 # for some reason https://huggingface.co/settings/tokens is not showing as a link by default?
 DESCRIPTION = """
-This Space allows you to automatically
+This Space allows you to automatically export 🤗 transformers PyTorch models hosted on the Hugging Face Hub to [ONNX](https://onnx.ai/). It opens a PR on the target model, and it is up to the owner of the original model
 to merge the PR to allow people to leverage the ONNX standard to share and use the model on a wide range of devices!

-Once
+Once exported, the model can, for example, be used in the [🤗 Optimum](https://huggingface.co/docs/optimum/) library closely following the transformers API.
 Check out [this guide](https://huggingface.co/docs/optimum/main/en/onnxruntime/usage_guides/models) to see how!

 The steps are as following:
 - Paste a read-access token from [https://huggingface.co/settings/tokens](https://huggingface.co/settings/tokens). Read access is enough given that we will open a PR against the source repo.
 - Input a model id from the Hub (for example: [textattack/distilbert-base-cased-CoLA](https://huggingface.co/textattack/distilbert-base-cased-CoLA))
-- Click "
-- That's it! You'll get feedback on if the
+- Click "Export to ONNX"
+- That's it! You'll get feedback on if the export was successful or not, and if it was, you'll get the URL of the opened PR!

-Note: in case the model to
+Note: in case the model to export is larger than 2 GB, it will be saved in a subfolder called `onnx/`. To load it from Optimum, the argument `subfolder="onnx"` should be provided.
 """

 with gr.Blocks() as demo:
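The updated description says the exported model can be used from 🤗 Optimum following the transformers API, and that exports larger than 2 GB end up in an `onnx/` subfolder. A minimal sketch of what that usage could look like, assuming a sequence-classification checkpoint whose ONNX PR has already been merged and whose files sit under `onnx/` (the model id is the example from the description; not code from this Space):

```python
# Illustrative sketch: consume an exported ONNX model with Optimum's ONNX Runtime
# integration, mirroring the transformers API.
from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForSequenceClassification

model_id = "textattack/distilbert-base-cased-CoLA"  # example id from the description

tokenizer = AutoTokenizer.from_pretrained(model_id)

# If the export was larger than 2 GB, the ONNX files live under `onnx/`,
# hence the subfolder argument mentioned in the note above. If the PR is not
# merged yet, a revision such as revision="refs/pr/<n>" would also be needed.
model = ORTModelForSequenceClassification.from_pretrained(model_id, subfolder="onnx")

classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("This sentence is grammatically acceptable."))
```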
@@ -140,7 +140,7 @@ with gr.Blocks() as demo:
             label="ONNX opset (optional, can be left blank)",
         )

-        btn = gr.Button("
+        btn = gr.Button("Export to ONNX")
         output = gr.Markdown(label="Output")

         btn.click(
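This last hunk only renames the button label, but the truncated `btn.click(` hints at the wiring around it. Below is a self-contained sketch of how a Gradio Blocks app like this one is typically wired; the component names, the stub export function, and the exact `click` signature are assumptions, not the Space's actual code:

```python
# Hypothetical sketch of the Blocks wiring around the renamed button.
import gradio as gr

def onnx_export(token: str, model_id: str, task: str, opset: str) -> str:
    # Stub standing in for the Space's real export logic.
    return f"#### Success 🔥 Exported {model_id} (task={task or 'auto'}, opset={opset or 'default'})"

with gr.Blocks() as demo:
    token = gr.Textbox(label="Hugging Face token (read access)")
    model_id = gr.Textbox(label="Model id (e.g. textattack/distilbert-base-cased-CoLA)")
    task = gr.Textbox(label="Task (optional, can be left blank)")
    opset = gr.Textbox(label="ONNX opset (optional, can be left blank)")

    btn = gr.Button("Export to ONNX")
    output = gr.Markdown(label="Output")

    # The button passes the textbox values positionally to the function and
    # renders the returned Markdown string in `output`.
    btn.click(fn=onnx_export, inputs=[token, model_id, task, opset], outputs=output)

demo.launch()
```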