Starburst15 committed
Commit b1ee5b7 · verified · 1 Parent(s): 24c9b53

Update README.md

Files changed (1): README.md (+53 -14)
README.md CHANGED
@@ -1,19 +1,58 @@
- ---
- title: Handbook Chatbot
- emoji: 🚀
- colorFrom: red
- colorTo: red
- sdk: docker
- app_port: 8501
- tags:
- - streamlit
  pinned: false
- short_description: Streamlit template space
  ---

- # Welcome to Streamlit!

- Edit `/src/streamlit_app.py` to customize this app to your heart's desire. :heart:

- If you have any questions, checkout our [documentation](https://docs.streamlit.io) and [community
- forums](https://discuss.streamlit.io).
+ title: "USTP Student Handbook Assistant"
+ emoji: "📘"
+ colorFrom: "purple"
+ colorTo: "indigo"
+ sdk: "streamlit"
+ sdk_version: "1.39.0"
+ app_file: src/streamlit_app.py
  pinned: false
+ license: "mit"
  ---

+ # 📘 USTP Student Handbook Assistant (2023 Edition)
+
+ This Streamlit app lets students, faculty, and staff **ask questions about the USTP Student Handbook (2023 Edition)** and get **accurate, page-referenced answers** directly from the document — powered by **FAISS**, **Sentence Transformers**, and **open-source LLMs** such as Mistral, Mixtral, and Qwen.
+
+ ---
+
+ ## 🚀 Features
+ - ✅ Reads and indexes the *USTP Student Handbook 2023 Edition* PDF
+ - ✅ Fast semantic search with **FAISS vector database**
+ - ✅ Accurate citation with **printed page numbers**, not raw PDF indices
+ - ✅ Choose between **multiple open-source models** (Mistral, Mixtral, Qwen, etc.)
+ - ✅ Offline-safe — works even without API tokens
+ - ✅ Automatic local embedding with **MiniLM** for fast responses
+ - ✅ Caches index for instant re-use
+
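+ The sketch below shows one way such an indexing pipeline could be wired together. It is a minimal illustration, not the app's actual code: the helper names, the use of `pypdf` for page extraction, and the `handbook.faiss` cache path are assumptions.
+
+ ```python
+ # Illustrative indexing/search sketch (library choices and names are assumptions)
+ import os
+ import faiss
+ import numpy as np
+ from pypdf import PdfReader
+ from sentence_transformers import SentenceTransformer
+
+ PDF_PATH = "USTP Student Handbook 2023 Edition.pdf"
+ INDEX_PATH = "handbook.faiss"  # hypothetical cache file
+ EMBEDDER = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
+
+ def load_pages(pdf_path: str) -> list[str]:
+     """Extract text per page so answers can point back to a page."""
+     reader = PdfReader(pdf_path)
+     return [page.extract_text() or "" for page in reader.pages]
+
+ def build_or_load_index(pages: list[str]) -> faiss.Index:
+     """Re-use a cached FAISS index when present; otherwise embed pages and save it."""
+     if os.path.exists(INDEX_PATH):
+         return faiss.read_index(INDEX_PATH)
+     vectors = EMBEDDER.encode(pages, normalize_embeddings=True)
+     index = faiss.IndexFlatIP(vectors.shape[1])  # cosine similarity via inner product
+     index.add(np.asarray(vectors, dtype="float32"))
+     faiss.write_index(index, INDEX_PATH)
+     return index
+
+ def search(index: faiss.Index, pages: list[str], query: str, k: int = 3):
+     """Return the top-k pages as (pdf_page_index + 1, text) pairs.
+     Mapping PDF indices to the handbook's printed page numbers is app-specific and omitted here."""
+     query_vec = EMBEDDER.encode([query], normalize_embeddings=True)
+     _, ids = index.search(np.asarray(query_vec, dtype="float32"), k)
+     return [(int(i) + 1, pages[int(i)]) for i in ids[0]]
+ ```
+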
27
+ ---
28
 
29
+ ## 🧠 LLM Integration (Optional)
30
+ You can enhance the assistant’s responses with **Hugging Face Inference API** or run it completely **offline** using local models.
31
 
32
+ ### 🔑 To configure:
33
+ 1. Create a `.env` file in the app root directory.
34
+ 2. Add your Hugging Face token (optional): HF_TOKEN = your_huggingface_token
35
+ 3. Save the file and **restart the app**.
36
+
37
+ > 💡 If you don’t provide a token, the app will automatically use a **local SentenceTransformer model** for embeddings.
38
+
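+ As a rough sketch of how this token fallback could work (the variable names are assumptions and `python-dotenv` is assumed for reading `.env`; this is not the app's actual code):
+
+ ```python
+ # Illustrative token handling; python-dotenv and huggingface_hub are assumed dependencies
+ import os
+ from dotenv import load_dotenv
+
+ load_dotenv()                     # reads HF_TOKEN from .env if the file exists
+ HF_TOKEN = os.getenv("HF_TOKEN")  # None when no token is configured
+
+ if HF_TOKEN:
+     # Token available: remote generation via the Hugging Face Inference API is an option.
+     from huggingface_hub import InferenceClient
+     client = InferenceClient(model="mistralai/Mistral-7B-Instruct-v0.3", token=HF_TOKEN)
+ else:
+     # No token: stay fully offline; embeddings still come from the local MiniLM model.
+     client = None
+ ```
+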
+ ---
+
+ ## 🛠️ Deployment Notes
+ - **Runtime:** Python SDK
+ - **SDK:** Streamlit
+ - **App file:** `src/streamlit_app.py`
+ - **PDF file:** Must be named `USTP Student Handbook 2023 Edition.pdf` and placed in the same directory.
+ - **Recommended visibility:** **Public** (for demo and student access)
+ - **Supported models:**
+   - `mistralai/Mistral-7B-Instruct-v0.3`
+   - `mistralai/Mixtral-8x7B-Instruct-v0.1`
+   - `Qwen/Qwen2.5-14B-Instruct`
+
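+ For reference, here is a hedged sketch of how one of these models might be queried with retrieved handbook passages. The prompt wording, the `answer()` helper, and the use of `huggingface_hub`'s `InferenceClient.chat_completion` are assumptions about a typical setup, not the app's exact implementation.
+
+ ```python
+ # Illustrative answer generation against one of the supported models (assumed setup)
+ from huggingface_hub import InferenceClient
+
+ def answer(question: str, passages: list[tuple[int, str]], token: str) -> str:
+     """Ask a hosted instruct model to answer from retrieved pages, citing page numbers."""
+     context = "\n\n".join(f"[Page {page}] {text}" for page, text in passages)
+     client = InferenceClient(model="mistralai/Mistral-7B-Instruct-v0.3", token=token)
+     response = client.chat_completion(
+         messages=[
+             {"role": "system", "content": "Answer only from the handbook excerpts and cite page numbers."},
+             {"role": "user", "content": f"{context}\n\nQuestion: {question}"},
+         ],
+         max_tokens=512,
+     )
+     return response.choices[0].message.content
+ ```
+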
+ ---
+ ### ⚠️ “Permission denied: '/.streamlit'”
+ If deploying in a restricted environment:
+ - Set the working directory to a writable path (e.g., `/home/appuser/app`).
+ - Or run:
+ ```bash
+ mkdir -p ~/.streamlit