tarekmasryo committed · verified
Commit 17c12e4 · 1 Parent(s): 237e7a8

Upload 5 files
Files changed (6)
  1. .gitattributes +1 -0
  2. LICENSE.txt +201 -0
  3. README.md +151 -5
  4. app.py +977 -0
  5. assets/Example.png +3 -0
  6. requirements.txt +2 -0
.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ assets/Example.png filter=lfs diff=lfs merge=lfs -text
LICENSE.txt ADDED
@@ -0,0 +1,201 @@
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright [yyyy] [name of copyright owner]
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
README.md CHANGED
@@ -1,13 +1,159 @@
  ---
  title: QuickStart
- emoji: 📊
- colorFrom: pink
- colorTo: red
  sdk: gradio
  sdk_version: 6.2.0
  app_file: app.py
  pinned: false
- short_description: 'Paste a Hugging Face URL or Repo ID '
  ---

- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
  ---
  title: QuickStart
+ emoji: 🚀
+ colorFrom: indigo
+ colorTo: green
  sdk: gradio
  sdk_version: 6.2.0
+ python_version: 3.11
  app_file: app.py
  pinned: false
+ license: apache-2.0
+ short_description: Generate reliable run/download snippets for any HF repo
  ---

+ # QuickStart
+
+ A Gradio Space that converts any Hugging Face **URL** or **Repo ID** into clean, copy-ready **first-run artifacts**:
+
+ - **Run snippet** (best-effort)
+ - **Download recipes** (Python + CLI)
+ - **Files view** + lightweight **risk hints** *(filename-based only)*
+ - **Exportable zip** with runnable scripts
+
+ [![UI](https://img.shields.io/badge/UI-Gradio-FF7A18)](https://www.gradio.app/)
+ ![Python](https://img.shields.io/badge/Python-3.11-blue)
+ ![License](https://img.shields.io/badge/License-Apache--2.0-orange)
+
+ ---
+
+ ## Live
+ - Space: https://huggingface.co/spaces/tarekmasryo/QuickStart
+
+ ## Preview
+ ![QuickStart UI](assets/Example.png)
+
+ ---
+
+ ## What problem does it solve?
+ Hugging Face repos are consistent to browse, but **starting them is not**:
+ - Different repo types (Model / Dataset / Space)
+ - Different download flows (Python vs CLI)
+ - Some repos are private/gated (token required)
+ - Large repos need reproducible "download + run" recipes
+
+ QuickStart standardizes the **first 5 minutes** into a repeatable workflow.
+
+ ---
+
+ ## Inputs
+ Accepted formats:
+
+ **Repo ID**
+ ```text
+ <owner>/<repo>
+ ```
+
+ **URLs**
+ ```text
+ https://huggingface.co/<owner>/<repo>
+ https://huggingface.co/datasets/<owner>/<repo>
+ https://huggingface.co/spaces/<owner>/<repo>
+ ```
+
+ Also works:
+ ```text
+ datasets/<owner>/<repo>
+ spaces/<owner>/<repo>
+ ```
+
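The accepted formats above are normalized by `parse_hf_input` in `app.py` (added in this commit); a condensed, runnable sketch of that logic:

```python
import re
from typing import Tuple

def parse_hf_input(user_input: str) -> Tuple[str, str]:
    # Normalize a full URL or a bare ID into (repo_type, repo_id).
    s = (user_input or "").strip()
    if "huggingface.co" in s:
        # Dataset/Space URLs carry an explicit prefix segment.
        m = re.search(r"huggingface\.co/(datasets|spaces)/([^?#]+)", s)
        if m:
            rt = "dataset" if m.group(1) == "datasets" else "space"
            path = m.group(2).strip("/")
            # Drop any /tree/..., /blob/..., /resolve/..., /raw/... suffix.
            return rt, re.split(r"/(tree|blob|resolve|raw)/", path)[0].strip("/")
        m2 = re.search(r"huggingface\.co/([^?#]+)", s)
        if m2:
            path = m2.group(1).strip("/")
            return "model", re.split(r"/(tree|blob|resolve|raw)/", path)[0].strip("/")
    if s.startswith("datasets/"):
        return "dataset", s.split("/", 1)[1]
    if s.startswith("spaces/"):
        return "space", s.split("/", 1)[1]
    return "model", s.strip("/")

print(parse_hf_input("https://huggingface.co/datasets/owner/repo"))  # ('dataset', 'owner/repo')
```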
+ ---
+
+ ## Outputs (Tabs)
+
+ ### 1) QuickStart
+ - Minimal Python snippet (best-effort)
+ - Install command
+
+ ### 2) Download
+ - Python recipe using `snapshot_download()`
+ - CLI recipe using `huggingface-cli download`
+
+ ### 3) Files
+ - Best-effort file list (limited)
+ - Risk hints summary (filename-based)
+
+ ### 4) Export
+ Creates a zip with a runnable, minimal scaffold:
+ - `run.py` (best-effort entry script)
+ - `download.py` (reproducible snapshot download)
+ - `requirements.txt`
+ - `.env.example`
+ - `README.md` (how to run locally)
+
+ ### 5) Badge
+ - Markdown badge linking to the target repo
+
+ ### 6) Details
+ - Raw repo metadata (JSON)
+
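The Download tab emits its recipes as copy-ready strings built around `snapshot_download()` and `huggingface-cli download`. A minimal sketch of how such recipe strings can be assembled (the `download_recipes` helper and its exact output format are illustrative, not the app's actual generator):

```python
def download_recipes(repo_type: str, repo_id: str) -> tuple:
    # Build a Python recipe and a CLI recipe as plain strings.
    # huggingface-cli defaults to models; other types need --repo-type.
    flag = {"model": "", "dataset": " --repo-type dataset", "space": " --repo-type space"}[repo_type]
    py = (
        "from huggingface_hub import snapshot_download\n"
        f'path = snapshot_download(repo_id="{repo_id}", repo_type="{repo_type}")\n'
        "print(path)"
    )
    cli = f"huggingface-cli download {repo_id}{flag}"
    return py, cli

py, cli = download_recipes("dataset", "owner/repo")  # placeholder repo ID
print(cli)  # huggingface-cli download owner/repo --repo-type dataset
```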
+ ---
+
+ ## Risk hints (non-audit)
+ **Important:** Risk hints are **filename-based only**:
+ - ✅ Flags names like `.env`, `token`, `api_key`, `credentials`, private keys
+ - ✅ Highlights common ML artifacts by extension (e.g., `.safetensors`, `.bin`, `.onnx`, `.gguf`)
+ - ❌ Does **not** scan file contents
+ - ❌ Not a security/compliance audit
+
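A filename-only check like the one behind these hints fits in a few lines (patterns borrowed from `files_risk_report` in `app.py`; the `risk_hints` helper name here is illustrative):

```python
import re

# Regexes over lowercased filenames only -- file contents are never read.
SUSPICIOUS = [r"\.env$", r"secrets?", r"token", r"api[_-]?key", r"credentials?", r"id_rsa", r"\.pem$"]
ARTIFACT_EXTS = (".safetensors", ".bin", ".onnx", ".gguf")

def risk_hints(paths):
    flagged = [p for p in paths if any(re.search(rx, p.lower()) for rx in SUSPICIOUS)]
    artifacts = sorted({p.rsplit(".", 1)[-1].lower() for p in paths
                        if p.lower().endswith(ARTIFACT_EXTS)})
    return {"flagged": flagged, "artifacts": artifacts}

print(risk_hints(["model.safetensors", "config.json", ".env"]))
```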
+ ---
+
+ ## Authentication (private / gated repos)
+ Set the `HF_TOKEN` environment variable.
+
+ **On Hugging Face Spaces**
+ - Settings → **Secrets**
+ - Add: `HF_TOKEN` = your token
+
+ **Locally**
+ Windows (PowerShell):
+ ```powershell
+ setx HF_TOKEN "YOUR_TOKEN"
+ ```
+ Then restart the terminal so the new variable is picked up.
+
+ macOS/Linux:
+ ```bash
+ export HF_TOKEN="YOUR_TOKEN"
+ ```
+
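Once `HF_TOKEN` is set, `huggingface_hub` picks it up from the environment automatically. The sketch below shows the explicit form; the Bearer-header detail is standard Hub HTTP auth rather than anything QuickStart-specific, and `auth_header` is an illustrative helper:

```python
import os

# None when HF_TOKEN is unset -> anonymous access to public repos.
token = os.getenv("HF_TOKEN")

def auth_header(tok):
    # What the token becomes on the wire: a standard Bearer header.
    return {"Authorization": f"Bearer {tok}"} if tok else {}

print(auth_header(token))
```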
+ ---
+
+ ## Run locally
+ ```bash
+ git clone https://huggingface.co/spaces/tarekmasryo/QuickStart
+ cd QuickStart
+
+ python -m venv .venv
+ # Windows:
+ .venv\Scripts\activate
+ # macOS/Linux:
+ source .venv/bin/activate
+
+ pip install -U pip
+ pip install -r requirements.txt
+
+ python app.py
+ ```
+
+ ---
+
+ ## Known limitations (by design)
+ - Snippets are **best-effort** and depend on available Hub metadata.
+ - Files view is limited and may be incomplete for some repos.
+ - No content scanning (only filename signals).
+
+ ---
+
+ ## License
+ Apache-2.0
app.py ADDED
@@ -0,0 +1,977 @@
1
+ import os
2
+ import re
3
+ import html
4
+ import textwrap
5
+ import tempfile
6
+ import zipfile
7
+ import inspect
8
+ from functools import lru_cache
9
+ from typing import Any, Dict, Optional, Tuple, List
10
+
11
+ import gradio as gr
12
+ from huggingface_hub import HfApi
13
+ from huggingface_hub.utils import HfHubHTTPError
14
+
15
+
16
+ RE_REPO_ID = re.compile(r"^[A-Za-z0-9][A-Za-z0-9_.-]*/[A-Za-z0-9][A-Za-z0-9_.-]*$")
17
+
18
+
19
+ def esc(x: Any) -> str:
20
+ return html.escape("" if x is None else str(x), quote=True)
21
+
22
+
23
+ def norm_type(x: str) -> str:
24
+ x = (x or "model").strip().lower()
25
+ return x if x in {"model", "dataset", "space"} else "model"
26
+
27
+
28
+ def norm_id(x: str) -> str:
29
+ return (x or "").strip().strip("/")
30
+
31
+
32
+ def is_valid_repo_id(repo_id: str) -> bool:
33
+ return bool(RE_REPO_ID.match(norm_id(repo_id)))
34
+
35
+
36
+ def human_bytes(n: Optional[int]) -> str:
37
+ if not isinstance(n, int) or n <= 0:
38
+ return "N/A"
39
+ units = ["B", "KB", "MB", "GB", "TB"]
40
+ x = float(n)
41
+ i = 0
42
+ while x >= 1024 and i < len(units) - 1:
43
+ x /= 1024
44
+ i += 1
45
+ return f"{x:.2f} {units[i]}"
46
+
47
+
48
+ def safe_str(x: Any, max_chars: int = 500) -> str:
49
+ s = "" if x is None else str(x)
50
+ s = re.sub(r"\s+", " ", s).strip()
51
+ if len(s) > max_chars:
52
+ return s[: max_chars - 3] + "..."
53
+ return s
54
+
55
+
56
+ def call_with_optional_kwargs(fn, *args, **kwargs):
57
+ try:
58
+ sig = inspect.signature(fn)
59
+ allowed = set(sig.parameters.keys())
60
+ safe_kwargs = {k: v for k, v in kwargs.items() if k in allowed}
61
+ return fn(*args, **safe_kwargs)
62
+ except Exception:
63
+ return fn(*args)
64
+
65
+
66
+ def parse_hf_input(user_input: str) -> Tuple[str, str]:
67
+ s = (user_input or "").strip()
68
+ if not s:
69
+ return "model", ""
70
+
71
+ if "huggingface.co" in s:
72
+ m = re.search(r"huggingface\.co/(datasets|spaces)/([^?#]+)", s)
73
+ if m:
74
+ rt = "dataset" if m.group(1) == "datasets" else "space"
75
+ path = m.group(2).strip("/")
76
+ path = re.split(r"/(tree|blob|resolve|raw)/", path)[0].strip("/")
77
+ return rt, path
78
+
79
+ m2 = re.search(r"huggingface\.co/([^?#]+)", s)
80
+ if m2:
81
+ path = m2.group(1).strip("/")
82
+ path = re.split(r"/(tree|blob|resolve|raw)/", path)[0].strip("/")
83
+ return "model", path
84
+
85
+ if s.startswith("datasets/"):
86
+ return "dataset", s.replace("datasets/", "", 1).strip("/")
87
+ if s.startswith("spaces/"):
88
+ return "space", s.replace("spaces/", "", 1).strip("/")
89
+
90
+ return "model", s.strip("/")
91
+
92
+
93
+ def hf_url(repo_type: str, repo_id: str) -> str:
94
+ rt = norm_type(repo_type)
95
+ rid = norm_id(repo_id)
96
+ if rt == "dataset":
97
+ return f"https://huggingface.co/datasets/{rid}"
98
+ if rt == "space":
99
+ return f"https://huggingface.co/spaces/{rid}"
100
+ return f"https://huggingface.co/{rid}"
101
+
102
+
103
+ def safe_hf_error(e: HfHubHTTPError) -> str:
104
+ status = getattr(getattr(e, "response", None), "status_code", "N/A")
105
+ msg = getattr(e, "server_message", None) or str(e)
106
+ return f"Hugging Face Error: {status} - {safe_str(msg, 500)}"
107
+
108
+
109
+ def extract_file_entries(info_obj) -> List[Dict[str, Any]]:
110
+ out: List[Dict[str, Any]] = []
111
+ siblings = getattr(info_obj, "siblings", None) or []
112
+ for s in siblings:
113
+ name = getattr(s, "rfilename", None) or getattr(s, "path", None) or None
114
+ if not name:
115
+ continue
116
+ size = getattr(s, "size", None)
117
+ if size is None:
118
+ lfs = getattr(s, "lfs", None)
119
+ size = getattr(lfs, "size", None) if lfs is not None else None
120
+ out.append({"path": str(name), "size": int(size) if isinstance(size, int) else None})
121
+ return out
122
+
123
+
124
+ def files_risk_report(files: List[Dict[str, Any]]) -> Dict[str, Any]:
125
+ paths = [f.get("path", "") for f in files if f.get("path")]
126
+ total_known = sum(int(f["size"]) for f in files if isinstance(f.get("size"), int))
127
+
128
+ has_gguf = any(p.lower().endswith(".gguf") for p in paths)
129
+ has_onnx = any(p.lower().endswith(".onnx") for p in paths)
130
+ has_safetensors = any(p.lower().endswith(".safetensors") for p in paths)
131
+ has_bin = any(p.lower().endswith(".bin") for p in paths)
132
+
133
+ suspicious_names = []
134
+ suspicious_patterns = [
135
+ r"\.env$",
136
+ r"secrets?",
137
+ r"token",
138
+ r"api[_-]?key",
139
+ r"credentials?",
140
+ r"id_rsa",
141
+ r"\.pem$",
142
+ r"\.p12$",
143
+ r"\.kdbx$",
144
+ ]
145
+ for p in paths:
146
+ pl = p.lower()
147
+ if any(re.search(rx, pl) for rx in suspicious_patterns):
148
+ suspicious_names.append(p)
149
+
150
+ return {
151
+ "files_count": len(paths),
152
+ "total_size_known": total_known if total_known > 0 else None,
153
+ "has_gguf": has_gguf,
154
+ "has_onnx": has_onnx,
155
+ "has_safetensors": has_safetensors,
156
+ "has_bin": has_bin,
157
+ "suspicious_names": suspicious_names[:30],
158
+ }
159
+
160
+
161
+ def warnings_from_meta(meta: Dict[str, Any]) -> List[str]:
162
+ w: List[str] = []
163
+ if (meta.get("Gated") == "Yes") or (meta.get("Private") == "Yes"):
164
+ w.append("Repo may require HF_TOKEN (private/gated).")
165
+
166
+ ts = meta.get("_risk", {}).get("total_size_known")
167
+ if isinstance(ts, int) and ts > 8 * 1024**3:
168
+ w.append("Large repo size detected (>8GB). Downloads may be slow; consider selective download.")
169
+
170
+ if meta.get("_has_gguf"):
171
+ w.append("GGUF detected. Prefer llama-cpp-python / llama.cpp flow for local CPU/GPU inference.")
172
+
173
+ pipeline = meta.get("Pipeline", "N/A")
174
+ if pipeline == "text-generation":
175
+ w.append("text-generation models often need GPU for good speed; device_map='auto' helps but not magic.")
176
+
177
+ return w
178
+
179
+
180
+ def status_card(meta_public: Dict[str, Any], warnings: List[str], rt: str, rid: str) -> str:
181
+ url = hf_url(rt, rid)
182
+
183
+ last_mod = meta_public.get("Last Modified", "N/A")
184
+ last_mod = str(last_mod).split()[0] if last_mod and last_mod != "N/A" else "N/A"
185
+
186
+ pills = []
187
+ if rt == "space":
188
+ sdk = meta_public.get("SDK", "N/A")
189
+ if sdk and sdk != "N/A":
190
+ pills.append(f"<span class='pill'>SDK: {esc(sdk)}</span>")
191
+
192
+ license_ = meta_public.get("License", "N/A")
193
+ if license_ and license_ != "N/A":
194
+ pills.append(f"<span class='pill'>{esc(license_)}</span>")
195
+
196
+ pipeline = meta_public.get("Pipeline", "N/A")
197
+ if pipeline and pipeline != "N/A":
198
+ pills.append(f"<span class='pill'>{esc(pipeline)}</span>")
199
+
200
+ size_s = meta_public.get("Total Size", "N/A")
201
+ if size_s and size_s != "N/A":
202
+ pills.append(f"<span class='pill'>{esc(size_s)}</span>")
203
+
204
+ gated = meta_public.get("Gated", "N/A")
205
+ if gated == "Yes":
206
+ pills.append("<span class='pill warn'>Gated</span>")
207
+
208
+ warn_html = ""
209
+ if warnings:
210
+ items = "".join([f"<li>{esc(x)}</li>" for x in warnings])
211
+ warn_html = f"""
212
+ <div class="warnbox">
213
+ <div class="warn_title">Warnings</div>
214
+ <ul class="warn_list">{items}</ul>
215
+ </div>
216
+ """
217
+
218
+ pills_html = "".join(pills) if pills else ""
219
+
220
+ likes = meta_public.get("Likes", 0)
221
+ downloads = meta_public.get("Downloads", 0)
222
+ author = meta_public.get("Author", "N/A")
223
+
224
+ return f"""
225
+ <div class="card ok">
226
+ <div class="head">
227
+ <div class="title">{esc(meta_public.get("Repo ID", rid))}</div>
228
+ <a class="link" href="{esc(url)}" target="_blank">Open</a>
229
+ </div>
230
+
231
+ <div class="pills">{pills_html}</div>
232
+
233
+ <div class="stats">
234
+ <div class="stat accent">
235
+ <div class="k">Likes</div>
236
+ <div class="v">{esc(likes)}</div>
237
+ </div>
238
+ <div class="stat">
239
+ <div class="k">Downloads</div>
240
+ <div class="v">{esc(downloads)}</div>
241
+ </div>
242
+ <div class="stat">
243
+ <div class="k">Last modified</div>
244
+ <div class="v">{esc(last_mod)}</div>
245
+ </div>
246
+ <div class="stat">
247
+ <div class="k">Author</div>
248
+ <div class="v">{esc(author)}</div>
249
+ </div>
250
+ </div>
251
+
252
+ {warn_html}
253
+ </div>
254
+ """
255
+
256
+
257
+ def status_err_card(msg: str) -> str:
258
+ return f"""
259
+ <div class="card err">
260
+ <div class="title">Failed</div>
261
+ <div class="msg">{esc(msg)}</div>
262
+ <div class="hint">
263
+ If this is a private/gated repo, provide a token locally or enable a server token for trusted use.
264
+ </div>
265
+ </div>
266
+ """
267
+
268
+
269
+ def render_risk_html(risk: Dict[str, Any]) -> str:
270
+ suspicious = risk.get("suspicious_names") or []
271
+ suspicious_html = ""
272
+ if suspicious:
273
+ items = "".join([f"<li><code>{esc(x)}</code></li>" for x in suspicious[:20]])
274
+ suspicious_html = f"""
275
+ <div class="riskbox">
276
+ <div class="risk_title">Potentially Sensitive Filenames</div>
277
+ <ul class="risk_list">{items}</ul>
278
+ <div class="risk_note">Filename-based only (no content scanning).</div>
279
+ </div>
280
+ """
281
+
282
+ feats = []
283
+ if risk.get("has_gguf"):
284
+ feats.append("GGUF")
285
+ if risk.get("has_onnx"):
286
+ feats.append("ONNX")
287
+ if risk.get("has_safetensors"):
288
+ feats.append("safetensors")
289
+ if risk.get("has_bin"):
290
+ feats.append(".bin")
291
+ feats_s = ", ".join(feats) if feats else "N/A"
292
+
293
+ size_s = human_bytes(risk.get("total_size_known")) if risk.get("total_size_known") else "N/A"
294
+
295
+ return f"""
296
+ <div class="card">
297
+ <div class="title">Files and risk</div>
298
+ <div class="mini_stats">
299
+ <span>Files <b>{esc(risk.get("files_count", 0))}</b></span>
300
+ <span>Size <b>{esc(size_s)}</b></span>
301
+ <span>Artifacts <b>{esc(feats_s)}</b></span>
302
+ </div>
303
+ {suspicious_html}
304
+ </div>
305
+ """
306
+
307
+
308
+ def to_files_table(files: List[Dict[str, Any]], limit: int = 250) -> List[List[Any]]:
309
+ rows: List[List[Any]] = []
310
+ for f in (files or [])[:limit]:
311
+ rows.append([f.get("path", ""), human_bytes(f.get("size")) if isinstance(f.get("size"), int) else "N/A"])
312
+ return rows
313
+
314
+
315
+ def filter_files(files: List[Dict[str, Any]], q: str, limit: int = 250) -> List[List[Any]]:
316
+ q = (q or "").strip().lower()
317
+ if not q:
318
+ return to_files_table(files, limit=limit)
319
+
320
+ out: List[List[Any]] = []
321
+ for f in files or []:
322
+ p = (f.get("path") or "")
323
+ if q in p.lower():
324
+ out.append([p, human_bytes(f.get("size")) if isinstance(f.get("size"), int) else "N/A"])
325
+ if len(out) >= limit:
326
+ break
327
+ return out
328
+
329
+
330
+ def first_file_with_ext(files: List[Dict[str, Any]], ext: str) -> Optional[str]:
331
+ ext = (ext or "").lower()
332
+ for f in files or []:
333
+ p = (f.get("path") or "")
334
+ if p.lower().endswith(ext):
335
+ return p
336
+ return None
337
+
338
+
339
+ def compute_requirements(rt: str, meta: Dict[str, Any]) -> List[str]:
340
+ rt = norm_type(rt)
341
+ pipeline_tag = (meta or {}).get("_pipeline_tag", "N/A")
342
+ has_gguf = bool((meta or {}).get("_has_gguf", False))
343
+
344
+ if rt == "dataset":
345
+ return ["datasets", "huggingface_hub"]
346
+
347
+ if rt == "space":
348
+ return ["gradio", "huggingface_hub", "requests"]
349
+
350
+ if has_gguf:
351
+ return ["huggingface_hub", "llama-cpp-python"]
352
+
353
+ if pipeline_tag == "text-generation":
354
+ return ["transformers", "huggingface_hub", "torch", "accelerate"]
355
+
356
+ if pipeline_tag in {"image-classification", "image-to-text", "image-segmentation", "object-detection"}:
357
+ return ["transformers", "huggingface_hub", "torch", "pillow", "requests"]
358
+
359
+ return ["transformers", "huggingface_hub", "torch"]
360
+
361
+
362
+ def generate_install(rt: str, meta: Dict[str, Any]) -> str:
363
+ return "pip install -U " + " ".join(compute_requirements(rt, meta))
364
+
365
+
366
+ def generate_quickstart(rt: str, rid: str, meta: Dict[str, Any]) -> str:
+     rt = norm_type(rt)
+     rid = norm_id(rid)
+ 
+     pipeline_tag = (meta or {}).get("_pipeline_tag", "N/A")
+     sdk = (meta or {}).get("_sdk", "N/A")
+     has_gguf = bool((meta or {}).get("_has_gguf", False))
+     files = (meta or {}).get("_files", []) or []
+ 
+     if rt == "dataset":
+         return textwrap.dedent(f"""
+             from datasets import load_dataset
+ 
+             ds = load_dataset("{rid}")
+             print(ds)
+         """).strip()
+ 
+     if rt == "space":
+         repo_dir = rid.split("/")[-1]
+         if sdk == "streamlit":
+             return textwrap.dedent(f"""
+                 import os
+                 import subprocess
+ 
+                 subprocess.check_call(["git", "clone", "{hf_url("space", rid)}"])
+                 os.chdir("{repo_dir}")
+                 subprocess.check_call(["python", "-m", "pip", "install", "-r", "requirements.txt"])
+                 subprocess.check_call(["streamlit", "run", "app.py"])
+             """).strip()
+         return textwrap.dedent(f"""
+             import os
+             import subprocess
+ 
+             subprocess.check_call(["git", "clone", "{hf_url("space", rid)}"])
+             os.chdir("{repo_dir}")
+             subprocess.check_call(["python", "-m", "pip", "install", "-r", "requirements.txt"])
+             subprocess.check_call(["python", "app.py"])
+         """).strip()
+ 
+     if has_gguf:
+         gguf_name = first_file_with_ext(files, ".gguf") or "MODEL.gguf"
+         # NOTE: the newline must be escaped (\\n) so the generated snippet
+         # contains a literal "\\n" instead of a raw line break inside the string.
+         return textwrap.dedent(f"""
+             from huggingface_hub import hf_hub_download
+             from llama_cpp import Llama
+ 
+             gguf_path = hf_hub_download(repo_id="{rid}", filename="{gguf_name}")
+             llm = Llama(model_path=gguf_path, n_ctx=4096)
+ 
+             out = llm("Q: Hello!\\nA:", max_tokens=128)
+             print(out["choices"][0]["text"])
+         """).strip()
+ 
+     if pipeline_tag == "text-generation":
+         return textwrap.dedent(f"""
+             from transformers import pipeline
+ 
+             pipe = pipeline(
+                 "text-generation",
+                 model="{rid}",
+                 device_map="auto",
+             )
+             out = pipe("Hello, Hugging Face!", max_new_tokens=64)
+             print(out[0]["generated_text"])
+         """).strip()
+ 
+     if pipeline_tag == "text-classification":
+         return textwrap.dedent(f"""
+             from transformers import pipeline
+ 
+             clf = pipeline("text-classification", model="{rid}")
+             print(clf("I love this project."))
+         """).strip()
+ 
+     if pipeline_tag == "image-classification":
+         return textwrap.dedent(f"""
+             from transformers import pipeline
+             from PIL import Image
+             import requests
+             from io import BytesIO
+ 
+             img = Image.open(BytesIO(requests.get(
+                 "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png"
+             ).content))
+             pipe = pipeline("image-classification", model="{rid}")
+             print(pipe(img))
+         """).strip()
+ 
+     return textwrap.dedent(f"""
+         from transformers import AutoTokenizer, AutoModel
+ 
+         tok = AutoTokenizer.from_pretrained("{rid}")
+         model = AutoModel.from_pretrained("{rid}")
+         print(type(model))
+     """).strip()
+ 
+ 
+ def generate_snapshot_download(rt: str, rid: str) -> str:
+     rt = norm_type(rt)
+     rid = norm_id(rid)
+     local_dir = rid.split("/")[-1]
+ 
+     return textwrap.dedent(f"""
+         from huggingface_hub import snapshot_download
+ 
+         path = snapshot_download(
+             repo_id="{rid}",
+             repo_type="{rt}",
+             local_dir="./{local_dir}",
+             local_dir_use_symlinks=False,
+         )
+         print(f"Downloaded to: {{path}}")
+     """).strip()
+ 
+ 
+ def generate_cli_download(rt: str, rid: str) -> str:
+     rt = norm_type(rt)
+     rid = norm_id(rid)
+     return f'huggingface-cli download {rid} --repo-type {rt} --local-dir "./downloaded_repo" --local-dir-use-symlinks False'
+ 
+ 
+ def generate_badge(rt: str, rid: str) -> str:
+     rt = norm_type(rt)
+     rid = norm_id(rid)
+     url = hf_url(rt, rid)
+     encoded = rid.replace("/", "%2F")
+     return f"[![Hugging Face](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-{encoded}-blue)]({url})"
+ 
+ 
+ def token_allowed_for_repo(repo_id: str) -> bool:
+     owners = os.getenv("TOKEN_ALLOWED_OWNERS", "").strip()
+     if not owners:
+         return True
+     allowed = {x.strip().lower() for x in owners.split(",") if x.strip()}
+     owner = (norm_id(repo_id).split("/")[0] if "/" in norm_id(repo_id) else "").lower()
+     return bool(owner) and owner in allowed
+ 
+ 
+ def get_effective_token(repo_id: str) -> Optional[str]:
+     if os.getenv("ALLOW_SERVER_TOKEN", "").strip() != "1":
+         return None
+     t = (os.getenv("HF_TOKEN") or "").strip()
+     if not t:
+         return None
+     return t if token_allowed_for_repo(repo_id) else None
+ 
+ 
+ def fetch_repo_info(repo_type: str, repo_id: str, token: Optional[str]) -> Tuple[bool, Optional[Dict[str, Any]], Optional[str]]:
+     api = HfApi()
+     rt = norm_type(repo_type)
+     rid = norm_id(repo_id)
+     token = (token or "").strip() or None
+ 
+     if not rid:
+         return False, None, "Empty Repo ID."
+     if not is_valid_repo_id(rid):
+         return False, None, "Invalid Repo ID. Expected: owner/name"
+ 
+     try:
+         if rt == "dataset":
+             info = call_with_optional_kwargs(api.dataset_info, rid, token=token, files_metadata=True)
+         elif rt == "space":
+             info = call_with_optional_kwargs(api.space_info, rid, token=token, files_metadata=True)
+         else:
+             info = call_with_optional_kwargs(api.model_info, rid, token=token, files_metadata=True)
+ 
+         card = getattr(info, "cardData", None) or {}
+         license_ = card.get("license") or getattr(info, "license", None) or "N/A"
+ 
+         gated = getattr(info, "gated", None)
+         private = getattr(info, "private", None)
+ 
+         pipeline = getattr(info, "pipeline_tag", None) or "N/A"
+         sdk = getattr(info, "sdk", None) or "N/A"
+ 
+         files = extract_file_entries(info)
+ 
+         if not files:
+             try:
+                 names = api.list_repo_files(repo_id=rid, repo_type=rt, token=token)
+                 files = [{"path": n, "size": None} for n in (names or [])]
+             except Exception:
+                 files = []
+ 
+         risk = files_risk_report(files)
+         total_size_str = human_bytes(risk.get("total_size_known")) if risk.get("total_size_known") else "N/A"
+ 
+         preview: Dict[str, Any] = {
+             "Repo ID": getattr(info, "id", rid),
+             "Type": rt,
+             "Author": getattr(info, "author", None) or getattr(info, "owner", None) or "N/A",
+             "Likes": getattr(info, "likes", 0) or 0,
+             "Downloads": getattr(info, "downloads", 0) or 0,
+             "Last Modified": safe_str(getattr(info, "lastModified", "N/A"), 200),
+             "License": str(license_) if license_ else "N/A",
+             "Pipeline": str(pipeline) if pipeline else "N/A",
+             "Gated": "Yes" if gated is True else ("No" if gated is False else "N/A"),
+             "Private": "Yes" if private is True else ("No" if private is False else "N/A"),
+             "Total Size": total_size_str,
+             "Files Count": risk.get("files_count", 0),
+         }
+ 
+         if rt == "space":
+             preview["SDK"] = sdk or "N/A"
+             hw = getattr(info, "hardware", None)
+             if hw:
+                 preview["Hardware"] = safe_str(hw, 200)
+ 
+         preview["_pipeline_tag"] = pipeline or "N/A"
+         preview["_sdk"] = sdk or "N/A"
+         preview["_files"] = files
+         preview["_risk"] = risk
+         preview["_has_gguf"] = bool(risk.get("has_gguf"))
+ 
+         return True, preview, None
+ 
+     except HfHubHTTPError as e:
+         return False, None, safe_hf_error(e)
+     except Exception as e:
+         return False, None, f"Unexpected Error: {safe_str(e, 500)}"
+ 
+ 
+ @lru_cache(maxsize=512)
+ def cached_public(repo_type: str, repo_id: str):
+     return fetch_repo_info(repo_type, repo_id, token=None)
+ 
+ 
+ def build_quickstart_zip(state: Dict[str, Any]) -> Tuple[Optional[str], str]:
+     if not isinstance(state, dict) or not state.get("Repo ID"):
+         return None, "No repo loaded yet."
+ 
+     rt = norm_type(state.get("Type", "model"))
+     rid = norm_id(state.get("Repo ID", "")) or norm_id(state.get("_rid", ""))
+ 
+     install = generate_install(rt, state)
+     quickstart = generate_quickstart(rt, rid, state)
+     snap = generate_snapshot_download(rt, rid)
+ 
+     readme = textwrap.dedent(f"""
+         # QuickStart — {rid}
+ 
+         ## Setup
+         ```bash
+         python -m venv .venv
+         pip install -U pip
+         pip install -r requirements.txt
+         ```
+ 
+         ## Run
+         ```bash
+         python run.py
+         ```
+ 
+         ## Download (optional)
+         ```bash
+         python download.py
+         ```
+     """).strip()
+ 
+     requirements = compute_requirements(rt, state)
+     env_example = "HF_TOKEN=\n"
+ 
+     # Keep this template flush-left: dedent() runs after f-string substitution,
+     # so an indented template would mis-indent the interpolated quickstart code.
+     run_py = textwrap.dedent(f"""\
+ import os
+ 
+ 
+ def main():
+     print("Install (reference):")
+     print("{install}")
+ 
+ {textwrap.indent(quickstart, "    ")}
+ 
+ 
+ if __name__ == "__main__":
+     main()
+ """)
+ 
+     download_py = snap + "\n"
+ 
+     tmpdir = tempfile.mkdtemp(prefix="quickstart_")
+     zip_path = os.path.join(tmpdir, f"{rid.replace('/', '__')}_quickstart.zip")
+ 
+     proj_dir = os.path.join(tmpdir, "project")
+     os.makedirs(proj_dir, exist_ok=True)
+ 
+     def write_file(path: str, content: str):
+         with open(path, "w", encoding="utf-8") as f:
+             f.write(content)
+ 
+     write_file(os.path.join(proj_dir, "README.md"), readme + "\n")
+     write_file(os.path.join(proj_dir, "requirements.txt"), "\n".join(requirements) + "\n")
+     write_file(os.path.join(proj_dir, ".env.example"), env_example)
+     write_file(os.path.join(proj_dir, "run.py"), run_py + "\n")
+     write_file(os.path.join(proj_dir, "download.py"), download_py + "\n")
+ 
+     with zipfile.ZipFile(zip_path, "w", compression=zipfile.ZIP_DEFLATED) as z:
+         for fname in ["README.md", "requirements.txt", ".env.example", "run.py", "download.py"]:
+             z.write(os.path.join(proj_dir, fname), arcname=fname)
+ 
+     return zip_path, "Zip built. Download it, unzip it, then run: python run.py"
+ 
+ 
+ def process(user_input: str, type_override: str):
+     auto_type, rid = parse_hf_input(user_input)
+     rt = auto_type if (type_override or "auto") == "auto" else norm_type(type_override)
+     rid = norm_id(rid)
+ 
+     token = get_effective_token(rid)
+ 
+     if token:
+         ok, meta, err = fetch_repo_info(rt, rid, token=token)
+     else:
+         ok, meta, err = cached_public(rt, rid)
+ 
+     if not ok or not meta:
+         empty_rows: List[List[Any]] = []
+         return (
+             status_err_card(err or "Unknown error"),
+             "",
+             "",
+             "",
+             "",
+             "",
+             empty_rows,
+             "",
+             {},
+             {},
+         )
+ 
+     meta_public = {k: v for k, v in meta.items() if not str(k).startswith("_")}
+ 
+     install = generate_install(rt, meta)
+     quickstart = generate_quickstart(rt, rid, meta)
+     snap = generate_snapshot_download(rt, rid)
+     cli = generate_cli_download(rt, rid)
+     badge = generate_badge(rt, rid)
+ 
+     files = meta.get("_files", []) or []
+     risk = meta.get("_risk", {}) or {}
+     warnings = warnings_from_meta(meta)
+ 
+     status = status_card(meta_public, warnings, rt, rid)
+     files_rows = to_files_table(files, limit=250)
+     risk_html = render_risk_html(risk)
+ 
+     state = dict(meta)
+     state["_rid"] = rid
+     state["_rt"] = rt
+ 
+     return (
+         status,
+         install,
+         quickstart,
+         snap,
+         cli,
+         badge,
+         files_rows,
+         risk_html,
+         meta_public,
+         state,
+     )
+ 
+ 
+ def do_filter_files(state: Dict[str, Any], q: str):
+     files = (state or {}).get("_files", []) or []
+     return filter_files(files, q, limit=250)
+ 
+ 
+ def build_ui():
+     theme = gr.themes.Soft(
+         primary_hue="orange",
+         secondary_hue="slate",
+         font=[gr.themes.GoogleFont("Inter"), "ui-sans-serif", "system-ui"],
+         radius_size=gr.themes.sizes.radius_lg,
+     )
+ 
+     css = """
+ .gradio-container { max-width: 1120px !important; margin: auto; }
+ 
+ .hero{
+   padding: 18px 18px;
+   border-radius: 18px;
+   border: 1px solid rgba(148,163,184,.25);
+   background:
+     radial-gradient(1200px 300px at 30% 0%, rgba(249,115,22,.18), transparent 60%),
+     radial-gradient(1000px 260px at 70% 20%, rgba(99,102,241,.14), transparent 55%),
+     linear-gradient(180deg, rgba(255,255,255,.03), rgba(255,255,255,0));
+   box-shadow: 0 12px 40px rgba(0,0,0,.10);
+   margin-bottom: 14px;
+ }
+ h1{
+   text-align:center;
+   margin: 0 0 6px 0;
+   color: var(--body-text-color);
+   font-weight: 850;
+   letter-spacing: -0.03em;
+ }
+ .sub{
+   text-align:center;
+   color: var(--body-text-color-subdued);
+   margin: 0;
+   line-height: 1.45;
+ }
+ 
+ .card{
+   padding: 14px 16px;
+   border-radius: 16px;
+   background: var(--block-background-fill);
+   border: 1px solid var(--block-border-color);
+   color: var(--body-text-color);
+   box-shadow: 0 10px 30px rgba(0,0,0,.06);
+ }
+ .ok{ border-left: 6px solid rgba(16,185,129,.95); }
+ .err{ border-left: 6px solid rgba(239,68,68,.95); }
+ 
+ .head{ display:flex; align-items:center; justify-content:space-between; gap:12px; flex-wrap:wrap; }
+ .title{ font-weight: 850; font-size: 1.02rem; }
+ .link{
+   color: #ffffff !important;
+   text-decoration: none !important;
+   font-weight: 900;
+   padding: 6px 12px;
+   border-radius: 10px;
+   border: none !important;
+   background: linear-gradient(135deg, rgba(249,115,22,1), rgba(245,158,11,1)) !important;
+   box-shadow: 0 10px 26px rgba(249,115,22,.18);
+ }
+ .link:hover{
+   filter: brightness(1.05);
+   transform: translateY(-0.5px);
+ }
+ 
+ .pills{ margin-top: 10px; display:flex; gap: 10px; flex-wrap:wrap; }
+ .pill{
+   display:inline-flex; align-items:center;
+   padding: 4px 10px; border-radius: 999px; font-size: .82rem;
+   border: 1px solid rgba(148,163,184,.28);
+   background: rgba(255,255,255,0.04);
+   color: var(--body-text-color);
+ }
+ .pill.warn{
+   border-color: rgba(245,158,11,.35);
+   background: rgba(245,158,11,.10);
+ }
+ 
+ .stats{
+   margin-top: 12px;
+   display: grid;
+   grid-template-columns: repeat(4, minmax(150px, 1fr));
+   gap: 10px;
+ }
+ .stat{
+   padding: 10px 12px;
+   border-radius: 14px;
+   border: 1px solid rgba(148,163,184,.22);
+   background:
+     radial-gradient(600px 140px at 0% 0%, rgba(255,255,255,.05), transparent 55%),
+     linear-gradient(180deg, rgba(255,255,255,.03), rgba(255,255,255,0));
+ }
+ .stat .k{
+   font-size: .76rem;
+   color: var(--body-text-color-subdued);
+   text-transform: uppercase;
+   letter-spacing: .08em;
+ }
+ .stat .v{
+   margin-top: 6px;
+   font-weight: 900;
+   font-size: 1.06rem;
+   color: var(--body-text-color);
+   font-variant-numeric: tabular-nums;
+ }
+ .stat.accent{
+   border-color: rgba(249,115,22,.30);
+   background:
+     radial-gradient(700px 160px at 10% 0%, rgba(249,115,22,.20), transparent 60%),
+     linear-gradient(180deg, rgba(255,255,255,.03), rgba(255,255,255,0));
+ }
+ @media (max-width: 820px){
+   .stats{ grid-template-columns: repeat(2, minmax(150px, 1fr)); }
+ }
+ @media (max-width: 460px){
+   .stats{ grid-template-columns: 1fr; }
+ }
+ 
+ .warnbox{
+   margin-top: 12px;
+   padding: 12px 12px;
+   border-radius: 14px;
+   border: 1px solid rgba(245, 158, 11, .30);
+   background: rgba(245, 158, 11, .08);
+ }
+ .warn_title{ font-weight: 850; margin-bottom: 6px; }
+ .warn_list{ margin: 0; padding-left: 18px; color: var(--body-text-color); }
+ 
+ .mini_stats{
+   display:flex;
+   gap: 14px;
+   flex-wrap:wrap;
+   margin-top: 10px;
+   color: var(--body-text-color-subdued);
+   font-size: .92rem;
+ }
+ 
+ .riskbox{
+   margin-top: 12px;
+   padding: 12px 12px;
+   border-radius: 14px;
+   border: 1px solid rgba(148,163,184,.20);
+   background: rgba(255,255,255,0.03);
+ }
+ .risk_title{ font-weight: 850; margin-bottom: 6px; }
+ .risk_list{ margin: 0; padding-left: 18px; color: var(--body-text-color); }
+ .risk_note{ margin-top: 6px; color: var(--body-text-color-subdued); font-size: .9rem; }
+ 
+ button.primary, .gr-button-primary, .primary > button {
+   border: none !important;
+   background: linear-gradient(135deg, rgba(249,115,22,1), rgba(245,158,11,1)) !important;
+   color: white !important;
+   font-weight: 850 !important;
+   box-shadow: 0 10px 26px rgba(249,115,22,.18);
+ }
+ button.primary:hover, .gr-button-primary:hover, .primary > button:hover {
+   filter: brightness(1.05);
+   transform: translateY(-0.5px);
+ }
+ """
+ 
+
881
+ df_sig = inspect.signature(gr.Dataframe)
882
+ df_count_kw = {"column_count": (2, "fixed")} if "column_count" in df_sig.parameters else {"col_count": (2, "fixed")}
883
+
884
+ with gr.Blocks(title="QuickStart") as demo:
885
+ gr.Markdown(
886
+ "<div class='hero'>"
887
+ "<h1>QuickStart</h1>"
888
+ "<p class='sub'>Paste a Hugging Face URL or Repo ID to generate run/download snippets and export a ready zip.</p>"
889
+ "</div>"
890
+ )
891
+
892
+ state = gr.State({})
893
+
894
+ with gr.Row(variant="panel"):
895
+ with gr.Column(scale=7):
896
+ inp = gr.Textbox(
897
+ label="HF URL or Repo ID",
898
+ placeholder="google/gemma-2-9b-it or https://huggingface.co/datasets/squad",
899
+ autofocus=True,
900
+ )
901
+ with gr.Column(scale=2):
902
+ t = gr.Dropdown(["auto", "model", "dataset", "space"], value="auto", label="Type")
903
+ with gr.Column(scale=2):
904
+ btn = gr.Button("Generate", variant="primary")
905
+
906
+ out_status = gr.HTML(label="Summary")
907
+
908
+ with gr.Tabs():
909
+ with gr.TabItem("QuickStart"):
910
+ out_py = gr.Code(language="python", label="Python QuickStart", interactive=False)
911
+ copy_py = gr.Button("Copy")
912
+ out_install = gr.Code(language="shell", label="Install", interactive=False)
913
+ copy_install = gr.Button("Copy")
914
+
915
+ with gr.TabItem("Download"):
916
+ out_snap = gr.Code(language="python", label="snapshot_download()", interactive=False)
917
+ copy_snap = gr.Button("Copy")
918
+ out_cli = gr.Code(language="shell", label="huggingface-cli download", interactive=False)
919
+ copy_cli = gr.Button("Copy")
920
+
921
+ with gr.TabItem("Files"):
922
+ file_q = gr.Textbox(label="Filter", placeholder="e.g. .gguf or config.json")
923
+ files_table = gr.Dataframe(
924
+ headers=["path", "size"],
925
+ datatype=["str", "str"],
926
+ label="Files (first 250)",
927
+ interactive=False,
928
+ row_count=10,
929
+ **df_count_kw,
930
+ )
931
+ risk_html = gr.HTML(label="Risk")
932
+
933
+ with gr.TabItem("Export"):
934
+ gr.Markdown("Exports a zip: `run.py`, `download.py`, `requirements.txt`, `.env.example`, `README.md`.")
935
+ zip_btn = gr.Button("Build Zip", variant="primary")
936
+ zip_file = gr.File(label="Zip file")
937
+ zip_msg = gr.Markdown()
938
+
939
+ with gr.TabItem("Badge"):
940
+ out_badge = gr.Code(language="markdown", label="Markdown", interactive=False)
941
+ copy_badge = gr.Button("Copy")
942
+
943
+ with gr.Accordion("Details", open=False):
944
+ out_meta = gr.JSON(label="Metadata")
945
+
946
+ outputs = [
947
+ out_status,
948
+ out_install,
949
+ out_py,
950
+ out_snap,
951
+ out_cli,
952
+ out_badge,
953
+ files_table,
954
+ risk_html,
955
+ out_meta,
956
+ state,
957
+ ]
958
+
959
+ btn.click(process, inputs=[inp, t], outputs=outputs)
960
+ inp.submit(process, inputs=[inp, t], outputs=outputs)
961
+
962
+ file_q.change(do_filter_files, inputs=[state, file_q], outputs=[files_table])
963
+ zip_btn.click(build_quickstart_zip, inputs=[state], outputs=[zip_file, zip_msg])
964
+
965
+ js_copy = "(t)=>{ if(!t){return [];} navigator.clipboard.writeText(String(t)); return []; }"
966
+ copy_install.click(None, inputs=[out_install], outputs=[], js=js_copy)
967
+ copy_py.click(None, inputs=[out_py], outputs=[], js=js_copy)
968
+ copy_snap.click(None, inputs=[out_snap], outputs=[], js=js_copy)
969
+ copy_cli.click(None, inputs=[out_cli], outputs=[], js=js_copy)
970
+ copy_badge.click(None, inputs=[out_badge], outputs=[], js=js_copy)
971
+
972
+ return demo, theme, css
973
+
974
+
975
+ if __name__ == "__main__":
976
+ app, theme, css = build_ui()
977
+ app.launch(theme=theme, css=css)
assets/Example.png ADDED

Git LFS Details

  • SHA256: 22f8b2d9bdf72ea6090fb1eda69252966eb8aaaad314e43963d055ae6a2f0431
  • Pointer size: 131 Bytes
  • Size of remote file: 274 kB
requirements.txt ADDED
@@ -0,0 +1,2 @@
 
 
 
+ gradio==6.2.0
+ huggingface_hub>=0.23.0
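For reference, the server-token gating in `app.py` (the `ALLOW_SERVER_TOKEN`, `HF_TOKEN`, and `TOKEN_ALLOWED_OWNERS` environment variables handled by `get_effective_token` / `token_allowed_for_repo`) reduces to one small decision function. The sketch below is a simplified, standalone recreation of that logic, not the file's exact code; the env values and repo IDs are illustrative only:

```python
from typing import Optional

def effective_token(repo_id: str, env: dict) -> Optional[str]:
    """Simplified recreation of the app's token gating: the server token is
    used only when explicitly enabled, non-empty, and the repo owner is on
    the allow-list (an empty allow-list permits every owner)."""
    if env.get("ALLOW_SERVER_TOKEN", "").strip() != "1":
        return None  # server token disabled entirely
    token = (env.get("HF_TOKEN") or "").strip()
    if not token:
        return None  # nothing to use
    owners = env.get("TOKEN_ALLOWED_OWNERS", "").strip()
    if not owners:
        return token  # no allow-list configured: token applies to any repo
    allowed = {x.strip().lower() for x in owners.split(",") if x.strip()}
    owner = repo_id.split("/")[0].lower() if "/" in repo_id else ""
    return token if owner and owner in allowed else None

# Illustrative values: the token only applies to repos owned by "acme".
env = {"ALLOW_SERVER_TOKEN": "1", "HF_TOKEN": "hf_xxx", "TOKEN_ALLOWED_OWNERS": "acme"}
print(effective_token("acme/model-a", env))   # hf_xxx
print(effective_token("other/model-b", env))  # None
```

This keeps the token server-side only: the app never echoes it into generated snippets, and restricting it to known owners limits the blast radius if the Space is probed with arbitrary repo IDs.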