---
title: World Geography
emoji: 💬
colorFrom: yellow
colorTo: purple
sdk: gradio
sdk_version: 5.42.0
app_file: app.py
pinned: false
hf_oauth: true
hf_oauth_scopes:
- inference-api
license: apache-2.0
short_description: Learning country names, locations, flags, and principal locations
---
# World Geography Game 🌍
An interactive geography game where you try to guess the country I'm thinking of, using up to 20 yes/no questions.
Built with [Gradio](https://gradio.app), [`huggingface_hub`](https://huggingface.co/docs/huggingface_hub/v0.22.2/en/index), and the [Hugging Face Inference API](https://huggingface.co/docs/api-inference/index).
## Architecture
The application follows a modular architecture with clear separation between the UI, game logic, and AI inference:

### Key Components
- **Gradio ChatInterface**: Provides the web-based chat UI
- **Game Logic**: Manages the 20-questions game flow and state (see the sketch after this list)
- **Country Selector**: Randomly selects countries and fetches facts
- **Response Cleaner**: Processes and formats AI model responses
- **Dual Mode Support**: Seamlessly switches between local and cloud inference
- **Facts Fetcher**: Enriches game data with real country information
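A minimal sketch of how the game state and response cleaner described above might fit together is shown below. The names (`GameState`, `clean_response`, `COUNTRIES`) and the shortened country list are illustrative assumptions, not the actual code in `app.py`:

```python
# Illustrative sketch only -- GameState, clean_response, and COUNTRIES
# are hypothetical names, not the real implementation in app.py.
import random
import re
from dataclasses import dataclass, field

COUNTRIES = ["France", "Japan", "Brazil", "Kenya"]  # the real app uses a full list

@dataclass
class GameState:
    """Tracks one round of the 20-questions game."""
    country: str = field(default_factory=lambda: random.choice(COUNTRIES))
    questions_asked: int = 0
    max_questions: int = 20

    def record_question(self) -> bool:
        """Count a question and report whether the player still has questions left."""
        self.questions_asked += 1
        return self.questions_asked <= self.max_questions

def clean_response(text: str) -> str:
    """Normalize a raw model reply: drop hidden reasoning tags and extra whitespace."""
    text = re.sub(r"<think>.*?</think>", "", text, flags=re.DOTALL)
    return " ".join(text.split())
```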
## Features
- 🎯 20 Questions gameplay format
- 🌍 Covers countries from around the world
- 🤖 AI-powered responses using Llama models
- 🏠 Local model support for development
- ☁️ Cloud deployment with Hugging Face OAuth
## Running Locally
To run this application with a local inference server (such as LM Studio or Ollama):
1. **Create a `.env` file** from the sample:
```bash
cp sample.env .env
```
2. **Configure your local model settings** in `.env`:
```env
MODEL_NAME=llama-3.2-3b-instruct
BASE_URL=http://127.0.0.1:1234/v1
TOKEN=abc123
```
3. **Install dependencies**:
```bash
pip install -r requirements.txt
```
4. **Run the application**:
```bash
python app.py
```
When running locally, the app will automatically detect the environment variables and use your local model instead of requiring Hugging Face OAuth login.
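A simplified sketch of that detection logic is shown below. The environment variable names come from `sample.env`; the client choices (here the `openai` package for the local server) and the hosted model id are assumptions, not necessarily what `app.py` does:

```python
# Sketch of local-vs-cloud client selection (illustrative, not the app's exact code).
import os
from dotenv import load_dotenv              # pip install python-dotenv
from huggingface_hub import InferenceClient
from openai import OpenAI                   # OpenAI-compatible client for LM Studio/Ollama

load_dotenv()  # reads MODEL_NAME, BASE_URL, TOKEN from .env if present

def get_client(hf_token: str | None = None):
    """Return (client, model): local server if BASE_URL is set, else the HF Inference API."""
    base_url = os.getenv("BASE_URL")
    if base_url:
        # Local mode: no Hugging Face login needed; TOKEN is whatever the local server expects.
        client = OpenAI(base_url=base_url, api_key=os.getenv("TOKEN", "not-needed"))
        return client, os.getenv("MODEL_NAME")
    # Cloud mode: requires the logged-in user's OAuth token.
    return InferenceClient(token=hf_token), "meta-llama/Llama-3.2-3B-Instruct"
```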
## Cloud Deployment
When deployed to Hugging Face Spaces or running without local environment variables, the app will use the Hugging Face Inference API and require users to log in with their Hugging Face account.
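A hedged sketch of what that OAuth path can look like is below. `gr.LoginButton` and `gr.OAuthToken` are standard Gradio OAuth pieces enabled by `hf_oauth: true` in the frontmatter, while the chat handler, model id, and generation parameters here are assumptions for illustration:

```python
# Sketch of an OAuth-gated chat handler; not the Space's actual app.py.
import gradio as gr
from huggingface_hub import InferenceClient

def respond(message, history, oauth_token: gr.OAuthToken | None = None):
    # Gradio injects the logged-in user's token when the parameter is typed gr.OAuthToken.
    if oauth_token is None:
        return "Please log in with your Hugging Face account to play."
    client = InferenceClient(token=oauth_token.token)
    result = client.chat_completion(
        model="meta-llama/Llama-3.2-3B-Instruct",   # illustrative model id
        messages=[{"role": "user", "content": message}],
        max_tokens=128,
    )
    return result.choices[0].message.content

with gr.Blocks() as demo:
    gr.LoginButton()            # "Sign in with Hugging Face" button
    gr.ChatInterface(respond)   # chat UI wired to the handler above

demo.launch()
```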
## How to Play
1. Think of questions that can be answered with "Yes" or "No"
2. Try to narrow down the location, language, geography, or other characteristics
3. You have 20 questions to guess the correct country
4. The AI will keep track of your question count and let you know when you've won or used all your questions
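Behavior like the question counting above is typically driven by a system prompt given to the model; the wording below is purely illustrative and not the prompt this Space actually uses:

```python
# Hypothetical system prompt; the real app's prompt may differ.
SYSTEM_PROMPT = (
    "You are hosting a 20-questions geography game. The secret country is {country}. "
    "Answer each question with only 'Yes' or 'No', then state the running count, e.g. "
    "'(Question {count} of 20)'. If the player names the secret country, tell them they won. "
    "If they run out of questions without guessing it, reveal the country and end the game."
)
```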
Good luck! 🎮