---
title: World Geography
emoji: 💬
colorFrom: yellow
colorTo: purple
sdk: gradio
sdk_version: 5.42.0
app_file: app.py
pinned: false
hf_oauth: true
hf_oauth_scopes:
  - inference-api
license: apache-2.0
short_description: Learning country names, locations, flags, and principal locations
---

World Geography Game 🌍

An interactive geography game where you try to guess the country the AI is thinking of, using up to 20 yes/no questions.

Built with Gradio, huggingface_hub, and the Hugging Face Inference API.

Architecture

The application follows a modular architecture with clear separation between the UI, game logic, and AI inference:

(Architecture diagram: modular design with an Application Layer, Data Layer, External Services, and a User Interface Layer.)

Key Components:

  • Gradio ChatInterface: Provides the web-based chat UI
  • Game Logic: Manages the 20-questions game flow and state
  • Country Selector: Randomly selects countries and fetches facts
  • Response Cleaner: Processes and formats AI model responses
  • Dual Mode Support: Seamlessly switches between local and cloud inference
  • Facts Fetcher: Enriches game data with real country information
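As a minimal sketch of the Response Cleaner component (the function name and the `<think>` tag handling are illustrative assumptions, not taken from the app's source):

```python
import re

def clean_response(raw: str) -> str:
    """Normalize a raw model reply before showing it in the chat UI.

    Strips chain-of-thought markup that some models emit (e.g. <think> tags),
    collapses runs of blank lines, and trims surrounding whitespace.
    """
    # Drop hidden reasoning blocks if the model emits them (hypothetical tag).
    text = re.sub(r"<think>.*?</think>", "", raw, flags=re.DOTALL)
    # Collapse three or more newlines into a single blank line.
    text = re.sub(r"\n{3,}", "\n\n", text)
    return text.strip()

print(clean_response("<think>secret</think>\n\n\nNo, it is not in Europe.  "))
```

A cleaner like this keeps the chat transcript tidy regardless of which model (local or cloud) produced the reply.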

Features

  • 🎯 20 Questions gameplay format
  • 🌍 Covers countries from around the world
  • 🤖 AI-powered responses using Llama models
  • 🏠 Local model support for development
  • ☁️ Cloud deployment with Hugging Face OAuth

Running Locally

To run this application with a local inference server such as LM Studio or Ollama:

  1. Create a .env file from the sample:

    cp sample.env .env
    
  2. Configure your local model settings in .env:

    MODEL_NAME=llama-3.2-3b-instruct
    BASE_URL=http://127.0.0.1:1234/v1
    TOKEN=abc123
    
  3. Install dependencies:

    pip install -r requirements.txt
    
  4. Run the application:

    python app.py
    

When running locally, the app automatically detects these environment variables and uses your local model instead of requiring a Hugging Face OAuth login.
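The detection step might look like the following sketch (the variable names match sample.env above; the helper name is illustrative):

```python
import os

def detect_mode() -> str:
    """Return 'local' when a local inference server is configured via
    environment variables, otherwise fall back to 'cloud' (HF Inference API)."""
    if os.getenv("BASE_URL") and os.getenv("MODEL_NAME"):
        return "local"
    return "cloud"
```

In "local" mode the app would point an OpenAI-compatible client at `BASE_URL`; in "cloud" mode it would instead use the Hugging Face Inference API with the logged-in user's token.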

Cloud Deployment

When deployed to Hugging Face Spaces, or when no local environment variables are set, the app uses the Hugging Face Inference API and requires users to log in with their Hugging Face account.

How to Play

  1. Think of questions that can be answered with "Yes" or "No"
  2. Try to narrow down the location, language, geography, or other characteristics
  3. You have 20 questions to guess the correct country
  4. The AI will keep track of your question count and let you know when you've won or used all your questions
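The question bookkeeping described in step 4 can be sketched as a small state object (the class and method names are illustrative, not from the app's source):

```python
class GameState:
    """Track guesses against the 20-question budget."""

    MAX_QUESTIONS = 20

    def __init__(self, answer: str):
        self.answer = answer.lower()
        self.questions_asked = 0

    def ask(self, message: str) -> str:
        """Count one question and report a win, a loss, or the running total."""
        self.questions_asked += 1
        # Naive win check: the player named the country in their message.
        if self.answer in message.lower():
            return f"You won in {self.questions_asked} questions!"
        if self.questions_asked >= self.MAX_QUESTIONS:
            return f"Out of questions! The country was {self.answer.title()}."
        return f"Question {self.questions_asked} of {self.MAX_QUESTIONS}."
```

In the real app this counter would live alongside the chat history so the AI can announce wins and the 20-question limit, as described above.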

Good luck! 🎮