
⛷️ Où Skier ?

Find your next Nordic skiing destination with real-time snow conditions for French ski areas.

Architecture

The project is split into two components:

Component   Directory   Description
Web app     web/        Next.js application displaying ski resort snow conditions
Worker      worker/     Data ingestion worker that fetches Nordic France snow bulletins

Both share the same PostgreSQL database schema and migration history, defined in prisma/schema.prisma and prisma/migrations/.

Database Schema

Resort
├── id             Int (PK)
├── name           String
├── region         Region?      (enum, metropolitan regions)
├── domainUrl      String?
├── latitude       Float?
├── longitude      Float?
├── createdAt      DateTime
└── updatedAt      DateTime

SnowRecord
├── id             Int (PK)
├── resortId       Int (FK → Resort)
├── recordDate     DateTime
├── openSlopes     Int?
├── totalSlopes    Int?
├── notes          String?
└── sourceUrl      String?
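
In application code these models surface through the generated Prisma Client. The interfaces below are a rough TypeScript sketch of those shapes (the Region enum is simplified to a plain string here), and formatOpenSlopes is a hypothetical helper, not part of the project, showing how the nullable slope counts might be handled:

```typescript
// Sketch of the model shapes (field names taken from the schema above);
// the Region enum is simplified to string for illustration.
interface Resort {
  id: number;
  name: string;
  region: string | null;      // Region? enum in Prisma
  domainUrl: string | null;
  latitude: number | null;
  longitude: number | null;
  createdAt: Date;
  updatedAt: Date;
}

interface SnowRecord {
  id: number;
  resortId: number;           // FK → Resort.id
  recordDate: Date;
  openSlopes: number | null;
  totalSlopes: number | null;
  notes: string | null;
  sourceUrl: string | null;
}

// Hypothetical helper: both slope counts are optional, so guard before formatting.
function formatOpenSlopes(r: Pick<SnowRecord, "openSlopes" | "totalSlopes">): string {
  if (r.openSlopes == null || r.totalSlopes == null) return "n/a";
  return `${r.openSlopes}/${r.totalSlopes} slopes open`;
}
```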

Getting Started

Prerequisites

  • Node.js 20+
  • PostgreSQL database (e.g. Neon for Vercel deployments)

Setup

  1. Clone the repository

    git clone https://github.com/whiver/ou-skier.git
    cd ou-skier
  2. Set up the web app

    cd web
    cp .env.example .env         # Prisma CLI reads this for migrate/generate
    cp .env.example .env.local   # Next.js app runtime
    npm install
    npx prisma migrate deploy
    npx prisma generate
    npm run dev
  3. Set up the worker

    cd worker
    cp .env.example .env  # then fill in DATABASE_URL
    npm install
    npx prisma generate
    npm run dev

Environment Variables

Both the web app and worker need a DATABASE_URL pointing to the same PostgreSQL database:

DATABASE_URL=postgresql://user:password@host:5432/ou_skier
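
Before either process connects, it can help to fail fast on a missing or malformed value. This is a hypothetical check, not part of the codebase, relying only on Node's built-in WHATWG URL parser:

```typescript
// Hypothetical startup check: fail fast if DATABASE_URL is missing or malformed.
function assertPostgresUrl(raw: string | undefined): URL {
  if (!raw) throw new Error("DATABASE_URL is not set");
  const url = new URL(raw); // throws on malformed input
  if (url.protocol !== "postgresql:" && url.protocol !== "postgres:") {
    throw new Error(`DATABASE_URL must use the postgresql:// scheme, got ${url.protocol}`);
  }
  return url;
}

// Example with the placeholder URL above:
const db = assertPostgresUrl("postgresql://user:password@host:5432/ou_skier");
console.log(`${db.hostname}:${db.port}${db.pathname}`); // host:5432/ou_skier
```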

Running the Worker

The worker fetches the latest Nordic France snow bulletin and upserts the data into the database.

It combines paginated weather cards with inline station metadata (Weather.posts) from the bulletin page to infer a massif, then maps that massif to a best-effort Region value. Because some massifs span multiple administrative regions, this attribution remains approximate.

For newly discovered resorts, the worker also attempts to fill in latitude/longitude once (BAN geocoder with a Nominatim fallback) so they can be displayed on a map.
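
Conceptually, the massif-to-region step is a lookup table. The sketch below is illustrative only: the massif names and Region values are assumptions, and the actual table lives in the worker. Unmapped massifs simply get no region:

```typescript
// Illustrative sketch of a partial massif → Region lookup; the real worker
// derives the massif from the bulletin's Weather.posts metadata first.
type Region = "AUVERGNE_RHONE_ALPES" | "BOURGOGNE_FRANCHE_COMTE" | "GRAND_EST";

const MASSIF_TO_REGION: Record<string, Region> = {
  // Hypothetical entries. Some massifs (e.g. the Jura) span several
  // administrative regions, so any single mapping is a best-effort choice.
  Vercors: "AUVERGNE_RHONE_ALPES",
  Jura: "BOURGOGNE_FRANCHE_COMTE",
  Vosges: "GRAND_EST",
};

function regionForMassif(massif: string): Region | null {
  return MASSIF_TO_REGION[massif] ?? null;
}
```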

cd worker
npm run dev      # development (ts-node)
npm run build    # compile TypeScript
npm start        # run compiled version

For automated updates, run the worker on a schedule (e.g. daily at 8:00 AM) using a cron job or a scheduled GitHub Actions workflow.
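
For the GitHub Actions route, a scheduled workflow might look like this sketch. The file path, Node version, and cron time are assumptions; it reuses the worker's npm scripts from the setup steps above:

```yaml
# Hypothetical workflow: .github/workflows/worker.yml
name: Run snow worker
on:
  schedule:
    - cron: "0 7 * * *"   # daily at 07:00 UTC
  workflow_dispatch: {}

jobs:
  run-worker:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: worker
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npx prisma generate
      - run: npm run build
      - run: npm start
        env:
          DATABASE_URL: ${{ secrets.DATABASE_URL }}
```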

Deployment

Web App (Vercel)

  1. Connect your GitHub repository to Vercel
  2. Set the root directory to web/
  3. Add DATABASE_URL to your Vercel environment variables
  4. Deploy

Worker

The worker can be deployed as:

  • A standalone Node.js process on any server
  • A scheduled GitHub Actions workflow

Automated weekly database backup (Supabase)

This repository includes a GitHub Actions workflow at .github/workflows/supabase-weekly-backup.yml that:

  • runs every Sunday at 02:00 UTC,
  • dumps the Supabase/PostgreSQL data (data only, no schema) to backups/supabase/data-YYYY-MM-DD.sql.gz,
  • keeps only the latest 6 weekly backups,
  • commits and pushes backup changes automatically.

Required GitHub secret:

  • DATABASE_URL: your Supabase Postgres connection string (the same one used by the app/worker).

You can also trigger the backup manually from the Actions tab using workflow_dispatch.

Data Source

Snow condition data is sourced from Nordic France — the official portal for Nordic skiing in France.

License

This project is licensed under the GNU General Public License v3.0.

Created February 9, 2026
Updated March 9, 2026