Real-time monitoring dashboard for the Self-Powered Arctic Ocean (SPAO) buoy system, developed at Pacific Northwest National Laboratory under the DOE Water Power Technologies Office.
Live Demo: https://spao-buoy-dashboard.streamlit.app/
| Page | What it does |
|---|---|
| Overview | KPI cards (active buoys, last contact, data quality), mini map, activity feed, trajectory map |
| Live Telemetry | Real-time data table with battery/CRC badges, date filtering, inline notes editing, CSV export |
| Packet Decoder | Hex packet decoder — single input with CRC card and GPS mini-map, or batch CSV upload |
| Archive | Multi-device data browser with KPI summary statistics |
| Analytics | Drift trajectory maps with detail panel, time-series sensor plots, custom 3D scatter |
Six telemetry packet versions are auto-detected by byte length:
| Version | Bytes | Key Difference |
|---|---|---|
| FY25 | 38 | Bering Sea deployment, supercapacitor fields |
| FY26 (v3) | 37 | Simplified previous-session fields |
| FY26 (v5) | 43 | Extended sensor set |
| FY26 (v5) + EC | 47 | Adds electrical conductivity & salinity |
| FY26 (v6.4) | 45 | Adds Prev Oper Time, TENG scale change |
| FY26 (v6.4) + EC | 49 | v6.4 + conductivity & salinity |
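The byte-length dispatch above can be sketched as a simple lookup. This is an illustrative sketch only — the real logic lives in `utils/decoders.py`, and the function name `detect_version` is an assumption, not the repo's actual API:

```python
# Hypothetical sketch of version detection by payload length.
# The actual mapping lives in utils/decoders.py; names here are illustrative.
PACKET_VERSIONS = {
    38: "FY25",
    37: "FY26 (v3)",
    43: "FY26 (v5)",
    47: "FY26 (v5) + EC",
    45: "FY26 (v6.4)",
    49: "FY26 (v6.4) + EC",
}

def detect_version(hex_packet: str) -> str:
    """Map a hex telemetry string to its packet version by byte length."""
    raw = bytes.fromhex(hex_packet.strip())
    try:
        return PACKET_VERSIONS[len(raw)]
    except KeyError:
        raise ValueError(f"Unknown packet length: {len(raw)} bytes")

# e.g. a 38-byte payload is treated as the FY25 format
print(detect_version("00" * 38))  # → FY25
```

Because every version has a distinct byte length, no header field is needed to disambiguate formats.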
```bash
git clone https://github.com/Denny-Hwang/spao-buoy-dashboard.git
cd spao-buoy-dashboard
pip install -r requirements.txt
```

Create `.streamlit/secrets.toml` with your GCP service account key:
```toml
[gcp_service_account]
type = "service_account"
project_id = "your-project-id"
private_key_id = "key-id"
private_key = "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n"
client_email = "sa@project.iam.gserviceaccount.com"
client_id = "123456789"
auth_uri = "https://accounts.google.com/o/oauth2/auth"
token_uri = "https://oauth2.googleapis.com/token"
auth_provider_x509_cert_url = "https://www.googleapis.com/oauth2/v1/certs"
client_x509_cert_url = "https://www.googleapis.com/robot/v1/metadata/x509/..."
```

Prerequisites: A GCP service account with the Google Sheets API enabled, and the target spreadsheet shared with the service account email as Editor. See docs/setup-guide.md for step-by-step instructions.
```bash
streamlit run app.py
```

The app opens at http://localhost:8501. Select a page from the sidebar to start.
```text
┌─────────────────────────────────────────────────────────┐
│ Streamlit App │
│ │
│ app.py (Home) │
│ pages/ │
│ ├─ Overview ── KPI cards, mini map, activity │
│ ├─ Live Telemetry── data table, battery/CRC badges │
│ ├─ Packet Decoder── hex decoder, GPS mini-map │
│ ├─ Archive ── past deployment browser │
│ └─ Analytics ── drift maps, sensor plots │
│ │
│ utils/ │
│ ├─ theme.py ── PNNL brand colors, UI helpers │
│ ├─ sheets_client.py ── Google Sheets read/write │
│ ├─ decoders.py ── packet decode (6 versions) │
│ ├─ map_utils.py ── Folium map builder │
│ └─ plot_utils.py ── Plotly chart styling │
└──────────────────┬──────────────────────────────────────┘
│ Google Sheets API
▼
┌──────────────────────────────────┐
│ Google Sheets │
│ (per-IMEI tabs, auto-decoded) │
└──────────────────┬───────────────┘
▲
│ RockBLOCK webhook POST
┌──────────────────┴───────────────┐
│ Google Apps Script Webhook │
│ (apps_script/Code.gs) │
│ — receives satellite data │
│ — decodes & appends to sheet │
└──────────────────────────────────┘
```
Key points:
- No backend server or database — Streamlit handles both UI and logic; Google Sheets is the sole data store.
- Data flows in two ways: the Apps Script webhook writes decoded telemetry into Google Sheets, and the Streamlit app reads it for display and analysis.
- Packet decoding is duplicated in both the webhook (Apps Script) and the dashboard (Python) so data can be ingested from either raw or pre-decoded sources.
- Caching — Streamlit's `@st.cache_data` with TTLs (60–300 s) minimizes API calls.
- PNNL Branding — consistent color system based on PNNL brand standards with accessible color contrast.
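The TTL-caching pattern is easy to reason about in isolation. A standalone, stdlib-only sketch of what `st.cache_data(ttl=...)` does conceptually (the real app uses Streamlit's decorator; `load_telemetry` is a hypothetical function name):

```python
import time
from functools import wraps

def ttl_cache(ttl_seconds: float):
    """Minimal stand-in for st.cache_data(ttl=...): reuse a result until it expires."""
    def decorator(fn):
        cache: dict = {}
        @wraps(fn)
        def wrapper(*args):
            now = time.monotonic()
            if args in cache:
                value, stamp = cache[args]
                if now - stamp < ttl_seconds:
                    return value
            value = fn(*args)
            cache[args] = (value, now)
            return value
        return wrapper
    return decorator

calls = 0

@ttl_cache(ttl_seconds=60)
def load_telemetry(tab: str) -> str:   # hypothetical Sheets read
    global calls
    calls += 1
    return f"rows for {tab}"

load_telemetry("IMEI-123")
load_telemetry("IMEI-123")  # second call served from cache within the TTL
print(calls)  # → 1
```

With 60–300 s TTLs, repeated page interactions within a window cost one Sheets API call instead of one per rerun.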
For detailed architecture documentation, see the docs/ folder.
Phase 2 extends the operational dashboard with cron-enriched scientific analysis pages. Phase 1 pages (1–6) remain untouched and the new pages are hidden behind the sidebar toggle "Show Phase 2 enriched columns" (default off), so the operational view is unchanged.
Phase 2 pages (skeleton — full implementation lands in follow-up PRs):
- `7_📖_Phase2_Overview.py` — reference page: data sources, cron pipeline, visualization legend
- `8_🔋_TENG_Performance.py` — harvested energy vs. sea state
- `9_🌊_SST_Validation.py` — buoy SST vs. OISST / MUR / OSTIA / ERA5 / Open-Meteo (coastal/inland)
- `10_🧭_Drift_Dynamics.py` — wind- and current-driven drift decomposition
- `11_📡_Data_Enrichment.py` — enrichment coverage and status
Enrichment pipeline. GitHub Actions workflows run on a schedule and write enriched columns back into the same Google Sheet, so the Streamlit app never calls external APIs at request time:
- `.github/workflows/enrichment_hourly.yml` — Open-Meteo Marine / Historical (every hour at :15)
- `.github/workflows/enrichment_daily.yml` — NOAA OISST, MUR, OSTIA, OSCAR, OSI SAF sea ice (07:30 UTC)
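A scheduled workflow of this shape would look roughly like the following. This is a hedged sketch, not the repo's actual file — the step layout and the script path `scripts/enrich_hourly.py` are assumptions:

```yaml
name: enrichment-hourly
on:
  schedule:
    - cron: "15 * * * *"    # every hour at :15 (UTC)
  workflow_dispatch: {}     # also allow manual/backfill runs
jobs:
  enrich:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      - run: python scripts/enrich_hourly.py   # assumed entry point
        env:
          GCP_SERVICE_ACCOUNT_JSON: ${{ secrets.GCP_SERVICE_ACCOUNT_JSON }}
          GOOGLE_SHEETS_ID: ${{ secrets.GOOGLE_SHEETS_ID }}
```

Keeping the `workflow_dispatch` trigger alongside `schedule` is what makes UI-initiated backfills possible.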
Required GitHub Actions secrets: `GCP_SERVICE_ACCOUNT_JSON`, `GOOGLE_SHEETS_ID`, `COPERNICUS_USERNAME`, `COPERNICUS_PASSWORD`.
Page 10 (📡 Data Enrichment) can dispatch the daily enrichment workflow from the Streamlit UI. To enable the "Trigger backfill" button, provide `GH_DISPATCH_TOKEN` either as a Streamlit secret or an environment variable:

- Create a fine-grained personal access token at https://github.com/settings/tokens?type=beta.
- Scope it to this repository only (`Denny-Hwang/spao-buoy-dashboard`).
- Grant **Actions: Read and write** repository permission (the minimum needed for `workflow_dispatch`).
- Provide the token via one of these methods:

  ```toml
  # Option A: Streamlit secret (.streamlit/secrets.toml)
  GH_DISPATCH_TOKEN = "github_pat_...."
  ```

  ```bash
  # Option B: environment variable
  export GH_DISPATCH_TOKEN="github_pat_...."
  ```

- Restart Streamlit. The button on page 10 becomes active.
Without the secret, the page renders an informational message and a disabled button — the scheduled cron still runs normally.
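Under the hood, triggering a `workflow_dispatch` is a single authenticated POST to the GitHub REST API. A stdlib-only sketch of how the dashboard might construct that request (built but not sent here; the `ref` value and the exact call site are assumptions):

```python
import json
import urllib.request

def build_dispatch_request(token: str, workflow: str = "enrichment_daily.yml",
                           ref: str = "main") -> urllib.request.Request:
    """Build (but do not send) the GitHub workflow_dispatch POST."""
    url = ("https://api.github.com/repos/Denny-Hwang/spao-buoy-dashboard"
           f"/actions/workflows/{workflow}/dispatches")
    return urllib.request.Request(
        url,
        data=json.dumps({"ref": ref}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        method="POST",
    )

req = build_dispatch_request("github_pat_example")
print(req.get_method())  # → POST
```

Sending the request with `urllib.request.urlopen(req)` returns HTTP 204 on success; the token only needs the Actions read/write scope described above.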
Streamlit secrets are unchanged for Phase 1 — only `gcp_service_account` is used by the running app.
See CLAUDE.md for the project-wide Phase 2 context.
- Push your repo to GitHub
- Go to streamlit.io/cloud and connect the repo
- Add the GCP service account JSON under Settings > Secrets
- Done — the app auto-deploys on every push to `main`
```bash
pip install -r requirements.txt
streamlit run app.py --server.port 8501
```

| Document | Description |
|---|---|
| docs/setup-guide.md | Full setup — GCP credentials, Google Sheets, Apps Script webhook |
| docs/architecture.md | Detailed architecture, data flow, and module reference |
| docs/packet-format.md | Telemetry packet structure for all hardware versions |
| docs/google-sheets-format.md | Spreadsheet structure, supported data formats, tab naming |
This project is licensed under the GPL-3.0 License — see the LICENSE file for details.
- Pacific Northwest National Laboratory
- DOE Water Power Technologies Office
- NOAA PMEL (Bering Sea deployment support)
Sungjoo Hwang — sungjoo.hwang@pnnl.gov — Pacific Northwest National Laboratory