A real-time dashboard for the Southern California coastline: tides, ocean conditions, wildlife sightings, live cams, and more.
Pacifica pulls data from dozens of sources—NOAA APIs, dive reports, whale watch logs, live cams, citizen science platforms—into a single tiled interface. Know what's happening on the coast right now and what's coming up.
Built for the curious, active Southern Californian who wants actionable intel (not just raw data) on where to dive, snorkel, whale watch, or explore tidepools.
| Layer | Technology |
|---|---|
| Frontend | React 18, Vite, TypeScript, D3.js |
| Backend | Python, FastAPI, WebSockets |
| Database | PostgreSQL 16 + TimescaleDB |
| Scrapers | Python, httpx, BeautifulSoup, APScheduler |
| Testing | Vitest (frontend), pytest (backend), Playwright |
| Infrastructure | Docker Compose, GitHub Actions |
```bash
# Clone and enter repo
git clone https://github.com/pandeiro/pacifica.git
cd pacifica

# Start services (Postgres + API)
make up

# Run migrations and seed data
make migrate
make seed

# In another terminal, start the frontend
make dev
```

Access:
- Dashboard: http://localhost:5173
- API: http://localhost:4900
- API Health: http://localhost:4900/api/health
- Docker and Docker Compose
- Node.js 18+ (for frontend dev server)
- Python 3.11+ (for running scrapers locally)
```bash
# Start/stop services
make up         # Start Docker services (Postgres + API)
make up-all     # Start all services including scraper
make down       # Stop all services

# Database
make migrate    # Run database migrations
make seed       # Seed initial data

# Frontend (requires `make up` first)
make dev        # Start Vite dev server
make build      # Build for production
make test       # Run Vitest tests
make lint       # Run ESLint
make typecheck  # Run TypeScript check

# Logs
make logs       # Tail all service logs
```

```
pacifica/
├── api/        # FastAPI application
├── scraper/    # Data collection scrapers
├── frontend/   # React/Vite dashboard
├── db/         # Migrations and seed data
├── tools/      # Development utilities
└── doc/        # Documentation (specs, references)
```
Pacifica aggregates data from multiple sources via automated scrapers:
| Source | Data | Frequency |
|---|---|---|
| NOAA CO-OPS | Tide predictions, water levels | Every 6 hours |
| NOAA CO-OPS | Water temperature (hourly averages) | Every 6 hours |
| sunrise-sunset.org | Sunrise, sunset, golden hour | Daily at 2:30 AM |
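As a concrete sketch, a CO-OPS tide-prediction request is a plain GET against NOAA's `datagetter` endpoint. The station ID (9410230, La Jolla) and the parameter choices below are illustrative assumptions, not necessarily what Pacifica's scraper uses:

```python
from urllib.parse import urlencode

COOPS_BASE = "https://api.tidesandcurrents.noaa.gov/api/prod/datagetter"

def coops_predictions_url(station: str, begin_date: str, end_date: str) -> str:
    """Build a NOAA CO-OPS tide-prediction request URL.

    Dates use CO-OPS's yyyymmdd format. The datum/units/time_zone values
    here are common SoCal defaults, chosen for illustration.
    """
    params = {
        "product": "predictions",
        "station": station,          # e.g. "9410230" = La Jolla, CA
        "begin_date": begin_date,
        "end_date": end_date,
        "datum": "MLLW",             # Mean Lower Low Water
        "units": "english",
        "time_zone": "lst_ldt",      # local time with DST
        "format": "json",
    }
    return f"{COOPS_BASE}?{urlencode(params)}"

# Example: one day of La Jolla tide predictions
print(coops_predictions_url("9410230", "20250101", "20250102"))
```

Fetching that URL with `httpx.get()` returns a JSON body with a `predictions` array of timestamped water levels.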
Scrapers run via GitHub Actions (see `.github/workflows/scrapers.yml`) and write directly to the TimescaleDB database. Each scraper is designed to be polite, respecting rate limits and backing off incrementally when a source pushes back.
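That politeness policy can be sketched as a small retry helper. The linear backoff schedule, function names, and defaults below are illustrative assumptions rather than Pacifica's actual code; the HTTP client is injected so the policy works with any library (Pacifica uses httpx):

```python
import time

def backoff_delays(max_retries: int = 4, base_delay: float = 2.0) -> list[float]:
    """Incremental (linear) backoff schedule: 2s, 4s, 6s, 8s by default."""
    return [base_delay * (attempt + 1) for attempt in range(max_retries)]

def polite_fetch(fetch, url: str, max_retries: int = 4, base_delay: float = 2.0):
    """Call `fetch(url)`; on failure, sleep per the schedule and retry.

    `fetch` is any callable that raises on transport errors or rate
    limiting (e.g. a wrapper around httpx.get that raises on HTTP 429).
    """
    last_error = None
    for delay in backoff_delays(max_retries, base_delay):
        try:
            return fetch(url)
        except Exception as exc:
            last_error = exc
            time.sleep(delay)
    raise RuntimeError(f"giving up on {url}") from last_error
```

Dependency injection here also makes the policy trivial to unit-test with a fake `fetch` and `base_delay=0`.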
MIT