andyed/scrutinizer2025

Scrutinizer — Foveated Vision Simulator


Live site: scrutinizer.app | Blog | YouTube

macOS Installer: Download v2.1.0 | Changelog


What Scrutinizer Does

Your eyes only see fine detail right where you're looking — everything else is blurry, color-shifted, and crowded. Scrutinizer simulates this, rendering any web page through a model of how human vision actually works, bound to your mouse cursor. Move the cursor and watch the rest of the page degrade the way your peripheral vision does. The question it answers: what can a user actually see at a glance, before their eyes move?

Tip: for usability practitioners, Scrutinizer works as a Restricted Focus Viewer — evaluate peripheral discoverability, color reliance, and layout hierarchy without eye-tracking hardware.

Dashboard with foveated rendering

A dashboard viewed through Scrutinizer. Cursor at center — detail and color fade with distance from fixation, and dense regions (text, grids) degrade more than isolated elements.

  • Congestion Overlay — visual clutter heatmap
  • Chromatic Pooling — peripheral color shift
  • Article Page — blog article with simulation

An Experiment in AI-Assisted Vision Science

Scrutinizer is built with AI coding tools (Claude Code and Gemini) as research partners — AI synthesizes literature and drafts implementations; the human evaluates scientific defensibility.

The v2.1 psychophysical validation is a case study: in a single day, AI and human together digitized data from papers spanning 1970–2025 (Rovamo 1979, Hansen 2009, Mullen & Kingdom 2002, Bowers 2025), built stimulus pages recreating the original experiments, ran the full validation battery, and found three shader bugs that had survived months of visual testing. All published data, stimuli, and analysis scripts ship with the repo.


Model Architecture

The rendering pipeline mirrors how the brain's visual pathway actually works — three processing stages, each doing something different to the image as it moves from eye to cortex. Full details in the Biological Model.

| Stage | What it does | How Scrutinizer simulates it |
| --- | --- | --- |
| LGN (relay) | Decides what gets through — suppresses blank areas, boosts important regions | Structure map (DOM analysis), saliency modulation |
| V1 (detail) | Processes edges and spatial detail — resolution drops with distance from fixation, nearby elements crowd each other | 8 half-octave DoG bands, density-gated crowding |
| V4 (color) | Handles color and object-level grouping — red-green fades before blue-yellow in periphery | Per-channel chromatic decay, coupled spatial pooling |

Resolution falloff across all stages follows a cortical magnification function — a log-mapping that describes how the brain allocates disproportionate processing power to the center of gaze.
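As a concrete sketch of that falloff: the inverse-linear form M(E) = M0 / (1 + E/E2) is the standard small-eccentricity approximation of the Schwartz log-mapping. The M0 and E2 values below are representative literature constants, not Scrutinizer's actual shader parameters.

```javascript
// Illustrative sketch of cortical magnification, not Scrutinizer's shader code.
// M(E) = M0 / (1 + E/E2): magnification halves every E2 degrees of eccentricity.
const M0 = 17.3; // mm of cortex per degree at fixation (assumed value)
const E2 = 0.75; // eccentricity (deg) at which magnification halves (assumed value)

function magnification(eccentricityDeg) {
  return M0 / (1 + eccentricityDeg / E2);
}

// Relative acuity: resolution scales with magnification, normalized to 1 at fixation.
function relativeAcuity(eccentricityDeg) {
  return magnification(eccentricityDeg) / M0;
}
```

Under these constants, acuity at 10° eccentricity is roughly 7% of foveal acuity, which is why peripheral text is unreadable without an eye movement.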

DOM-aware rendering. Scrutinizer reads the live DOM — grouping adjacent text nodes into paragraph clusters (Gestalt proximity), measuring local density from the node tree, and feeding that into the V1 crowding gate. A dense text column and an isolated heading at the same eccentricity get different treatment, because Rosenholtz's pooling regions compute different summary statistics over them.
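The density measurement could be sketched roughly like this; function and parameter names are illustrative, not Scrutinizer's actual API.

```javascript
// Hypothetical sketch: estimate local density from text-node bounding rects
// inside a circular pooling region centered on a point, the kind of signal
// a V1 crowding gate could consume.
function localDensity(rects, cx, cy, radius) {
  const r2 = radius * radius;
  let covered = 0;
  for (const { x, y, w, h } of rects) {
    // Use each rect's center as a cheap proxy for membership in the region.
    const dx = x + w / 2 - cx;
    const dy = y + h / 2 - cy;
    if (dx * dx + dy * dy <= r2) covered += w * h;
  }
  // Fraction of the pooling region covered by text boxes.
  return covered / (Math.PI * r2);
}
```

A stack of line boxes and a single isolated heading at the same point produce different densities, which is what lets the crowding gate treat them differently.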

Feature Congestion scoring runs alongside the pipeline, measuring visual clutter (color variance, edge density, contrast) to produce a 0–100 complexity score per region. See congestion-journey.md.
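A toy version of such a score, combining three normalized feature variances into a 0–100 value. The real Rosenholtz metric pools feature covariances across scales; the normalization constants here are assumptions.

```javascript
// Simplified Feature Congestion-style score (illustration only).
function variance(xs) {
  const mean = xs.reduce((a, b) => a + b, 0) / xs.length;
  return xs.reduce((a, b) => a + (b - mean) ** 2, 0) / xs.length;
}

// colorVals, edgeVals, contrastVals: per-pixel feature samples for a region.
// maxVar: assumed normalization constants for each feature channel.
function congestionScore(colorVals, edgeVals, contrastVals, maxVar = [1, 1, 1]) {
  const sub = [colorVals, edgeVals, contrastVals].map(
    (vals, i) => Math.min(1, variance(vals) / maxVar[i])
  );
  // Geometric-mean-style combination, scaled to 0-100.
  return 100 * Math.cbrt(sub[0] * sub[1] * sub[2]);
}
```

A uniform region scores 0; a region near the normalization ceiling on all three channels scores 100.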

Calibration. A Motion Silence staircase anchors the simulation to the user's actual perceptual foveal extent.


Features

Rendering Pipeline (v2.1)

  • 8 half-octave DoG bands — Difference-of-Gaussians peripheral reconstruction at √2 frequency spacing (5.66–0.5 cpd), validated against Rovamo & Virsu 1979
  • Foveal/peripheral simulation — eccentricity-dependent spatial pooling and chromatic filtering bound to cursor position
  • Analytical cortical magnification — eccentricity falloff using the Schwartz (1980) log-mapping parameterization (mode 6), alongside legacy (mode 7) for comparison
  • Feature Congestion pipeline — real-time visual clutter scoring with ComplexityHUD overlay (Score / Stats / Spatial tabs)
  • Congestion-gated pooling (mode 9) — peripheral attenuation weighted by local visual complexity
  • Saliency modulation — allocates more peripheral bandwidth to salient regions (edges, contrast, high-importance areas)
  • Structure map analysis — reads the live DOM to detect text rhythm, element density, font weight, and semantic type (ARIA roles), feeding the crowding and saliency stages
  • Visual memory simulation — iconic memory decay across 5 modes (Off, Limited, Extended, Infinite, Fixation Buffer)
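The band layout described in the first bullet can be written out directly. `dogResponse` is an illustrative frequency-domain band shape; `sigmaRatio` is an assumed parameter, not a value taken from the shaders.

```javascript
// The 8 half-octave band centers: 5.66 cpd down to 0.5 cpd in steps of 1/sqrt(2).
function dogBandFrequencies(top = 5.66, count = 8) {
  return Array.from({ length: count }, (_, i) => top / Math.SQRT2 ** i);
}

// Illustrative Difference-of-Gaussians band shape in the frequency domain:
// the gap between a broad and a narrow Gaussian centered on the band.
function dogResponse(f, center, sigmaRatio = 1.6) {
  const g = (s) => Math.exp(-(f * f) / (2 * s * s));
  return g(center * sigmaRatio) - g(center / sigmaRatio);
}
```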

Tools

  • Foveal Calibrator — online tool measuring perceptual foveal spread via Motion Silence psychophysics
  • scrutinizer-audit CLI — headless Playwright-based site auditor: Feature Congestion scoring, batch URL evaluation, sitemap crawling, CI gating (--fail-above N), heatmap export
  • MCP server — AI-assisted design review via analyze_url, analyze_urls, compare_pages tools for Claude Code integration
  • Golden capture pipeline — automated screenshot capture and SSIM/PSNR regression testing across versions

Interface

  • Extensibility modes — modular shader pipeline supports custom visual effects (Frosted Glass, Wireframe, Minecraft, Double Vision are included as test cases; see Developer's Guide)
  • Simulation menu — organized into Behavior (cognitive), Foveal (spatial), Peripheral (rendering), and Utility (debug) groups
  • Eccentricity overlay — boundary ring visualization for foveal/parafoveal/peripheral zones

Platform

  • macOS: Signed and notarized (v1.3+), Apple Silicon native
  • Figma plugin: Scrutinizer Pro — free with watermark, uses Figma DOM for prototype support

Validation & Reproducibility

Scrutinizer validates each pipeline stage against published psychophysical data spanning 55 years of vision science (1970–2025).

Psychophysical validation (v2.1)

Five waves test the shader against published human data. The pattern: render a known stimulus, measure output pixels at each eccentricity, compare against the original paper's measurements. Published data is digitized into machine-readable JSON in tests/validation/published-data/.

| Wave | Domain | Published basis | Key result |
| --- | --- | --- | --- |
| 1 | Chromatic decay | Hansen 2009; Mullen & Kingdom 2002 | RG/YV channel separation matches opponent-channel predictions |
| 2 | Spatial frequency | Rovamo & Virsu 1979 | Frequency-selective attenuation (not uniform blur), r=0.600 composite |
| 3 | Crowding geometry | Bouma 1970; Toet & Levi 1992 | R:T bug found and fixed; density gate validated at 3.3:1 |
| 4 | Saliency protection | Itti & Koch 2001; Hershler 2005 | Face saliency 4.79× control; protection ratio 0.283 |
| 5 | Mixed-density UI | Halverson & Hornof 2011 | Density gate predicts same sparse/dense pattern as EPIC model |

15 HTML reference pages ship as open-source psychophysical stimuli. Each validation wave has a capture script (Electron headless) and an analysis script (pixel measurement). Blog post: Measuring the Pipeline.

```shell
node scripts/capture-crowding.js        # Capture crowding stimuli through pipeline
node scripts/analyze-dog-bands.js       # Band weight analysis (pure math, no GPU)
```
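The per-eccentricity pixel-measurement step described above might look like this in outline; names are illustrative, not the repo's actual analysis-script API.

```javascript
// Sketch: RMS contrast of a captured frame's grayscale pixels inside an
// eccentricity annulus around the fixation point.
function annulusContrast(pixels, width, fx, fy, rInner, rOuter) {
  const vals = [];
  for (let i = 0; i < pixels.length; i++) {
    const x = i % width;
    const y = Math.floor(i / width);
    const d = Math.hypot(x - fx, y - fy);
    if (d >= rInner && d < rOuter) vals.push(pixels[i]);
  }
  if (vals.length === 0) return 0;
  const mean = vals.reduce((a, b) => a + b, 0) / vals.length;
  const rms = Math.sqrt(vals.reduce((a, b) => a + (b - mean) ** 2, 0) / vals.length);
  return mean > 0 ? rms / mean : 0; // RMS contrast: std / mean luminance
}
```

Plotting this value per annulus against a paper's sensitivity curve is the shape of the comparison each wave performs.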

Regression testing

Golden captures. Automated screenshots at fixed viewport/URL/mode combinations, compared across versions using SSIM (≥0.98) and PSNR (≥35 dB) thresholds. The capture pipeline runs headlessly and produces paired comparison images stored in docs/golden/.

```shell
npm run capture-golden          # Generate reference captures
npm run golden-compare          # Compare current output against references
```
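PSNR is the standard formula; a minimal version (not the repo's exact implementation):

```javascript
// PSNR between two captures. a, b: equal-length arrays of 8-bit pixel values.
// Returns decibels; Infinity for identical frames. A capture passes at >= 35 dB.
function psnr(a, b, maxVal = 255) {
  let mse = 0;
  for (let i = 0; i < a.length; i++) mse += (a[i] - b[i]) ** 2;
  mse /= a.length;
  if (mse === 0) return Infinity;
  return 10 * Math.log10((maxVal * maxVal) / mse);
}
```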

Feature Congestion validation. The JavaScript implementation is cross-validated against the Python reference (Rosenholtz lab toolbox) on matched test images. Spearman rank correlation ρ=0.93.

```shell
npm run validate:python         # Run Python reference (requires uv + Python 3.12)
npm run validate:scrutinizer    # Run Scrutinizer's JS implementation
```
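Spearman's ρ, the statistic quoted above, can be sketched as follows (no tie handling; tied data needs average ranks):

```javascript
// Rank each value 1..n by sorted order.
function ranks(xs) {
  const order = xs.map((v, i) => [v, i]).sort((a, b) => a[0] - b[0]);
  const r = new Array(xs.length);
  order.forEach(([, idx], rank) => { r[idx] = rank + 1; });
  return r;
}

// Spearman rho via the rank-difference formula (assumes no ties).
function spearman(xs, ys) {
  const rx = ranks(xs);
  const ry = ranks(ys);
  const n = xs.length;
  let d2 = 0;
  for (let i = 0; i < n; i++) d2 += (rx[i] - ry[i]) ** 2;
  return 1 - (6 * d2) / (n * (n * n - 1));
}
```

Rank correlation is the right check here because the two implementations only need to agree on which images are more cluttered, not on absolute scores.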

Methodology note. Following the cross-validation approach advocated by Bowers et al. (2025), each pipeline stage is tested against its reference independently before integration. The simulation does not claim biological accuracy — it claims fidelity to the cited models, which are themselves approximations.


Calibration

Default: fovea_deg = 2.0, foveaRadius = 90px (45 px/°) — within 2% on reference hardware (MBP Retina @ 50cm). At different viewing distances the fixed mapping diverges (±30–40%). The Foveal Calibrator measures perceptual foveal extent via Motion Silence staircase but doesn't yet separate px_per_deg from comfort radius. Fix path: Project 1.3.
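The fixed mapping follows from viewing geometry: one degree of visual angle subtends distance × tan(1°) on the screen. A sketch, with parameter names hypothetical:

```javascript
// Pixels per degree of visual angle from viewing distance and pixel density.
function pxPerDegree(viewingDistanceCm, pxPerCm) {
  return viewingDistanceCm * Math.tan(Math.PI / 180) * pxPerCm;
}

// Foveal radius in pixels, e.g. 2.0 deg x 45 px/deg = 90 px (the default above).
function foveaRadiusPx(foveaDeg, pxPerDeg) {
  return foveaDeg * pxPerDeg;
}
```

This is also why the default diverges off the reference hardware: halving the viewing distance halves px/°, so the same 90 px circle covers twice the visual angle.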


Research Opportunities

Seventeen graduate-level projects are specified in grad-student-projects.md — vision science, HCI, design tools, and systems work, each with effort level, novelty, and IRB requirements.

Key open specs: oriented DoG bands (1.1), texture synthesis (1.2), calibrated visual angles (1.3), saccadic dynamics (1.4), eye tracker integration (3.3). Contributions welcome — see the Developer's Guide.


Known Limitations

  1. Calibration portability — default mapping is accurate on reference hardware (MBP Retina @ 50cm); diverges at other viewing distances. Fix: Project 1.3
  2. Approximate spatial pooling — uses averaged pixel blocks, not the texture-like statistical summaries the brain preserves in peripheral vision. Fix: Project 1.2
  3. Sequential color pipeline — spatial averaging runs before color attenuation, slightly over-degrading mid-peripheral color. Fix: ROADMAP
  4. No memory across fixations — each fixation renders independently; the brain accumulates information across eye movements. Visual Memory modes approximate this. See: simulation-limitations.md
  5. Mouse, not eyes — cursor tracking (~200ms latency) approximates but doesn't replicate gaze fixation. Fix: Project 3.3

Full gap analysis: simulation-limitations.md.


Installation

Download (v2.1.0)

Scrutinizer for macOS is Signed & Notarized — no security warnings.

View All Releases & Changelogs

Troubleshooting macOS Warnings (Manual/Unsigned Builds Only)

The official release v1.3.0+ is signed and notarized. These steps only apply to source builds or older versions.

  1. Right-click Scrutinizer.app → Open.
  2. Click Open when warned about the unidentified developer.
  3. If blocked, go to System Settings → Privacy & Security and click Open Anyway.
  4. Advanced: xattr -dr com.apple.quarantine /Applications/Scrutinizer.app.

Troubleshooting Windows SmartScreen

  1. Run the installer.
  2. If SmartScreen appears, click More info → Run anyway.

Developer Setup

```shell
npm install
npm start                       # Development mode
npm run build                   # Signed DMG (macOS)
npm test                        # Run test suite
```

CLI Setup

```shell
# scrutinizer-audit — headless visual complexity auditor
node cli/scrutinizer-audit.js https://example.com
node cli/scrutinizer-audit.js --sitemap https://example.com/sitemap.xml --fail-above 70

# MCP server — AI-assisted design review
claude mcp add scrutinizer-audit -- node cli/mcp/server.js
```

Usage & Controls

Basic Navigation

  1. Navigate — use the toolbar URL bar to enter URLs or search terms
  2. Toggle simulation — click the eye icon, press Cmd+Shift+F, or use Simulation → Foveal → Toggle
  3. Adjust radius — Left/Right arrow keys, or use Simulation → Foveal → Radius
  4. Calibrate — use the Foveal Calibrator to measure your actual foveal spread

Menu Structure

| Menu Group | Contents |
| --- | --- |
| Behavior | Visual Memory (5 modes), Structure Map, Saliency Modulation |
| Foveal | Toggle, Radius (6 sizes), Shape (4 aspect ratios) |
| Peripheral | Intensity (5 levels), Effect Type, Chromatic Aberration |
| Utility | Rendering modes, Structure Map view, Saliency Map view, Eccentricity Overlay |

Keyboard Shortcuts

| Key | Action |
| --- | --- |
| Cmd+Shift+F | Toggle foveal simulation |
| Right Arrow | Increase foveal radius |
| Left Arrow | Decrease foveal radius |
| Cmd+L | Focus URL bar |

Documentation


Acknowledgments

  • face-api.js (v1.7.15, Vladimir Mandic) — TinyFaceDetector powers the face channel in the saliency pipeline. MIT license.
  • Rosenholtz Lab — Feature Congestion metric, Texture Tiling Model, and peripheral vision research that grounds this project.
  • FOVI (Blauch, Alvarez & Konkle) — cortical magnification parameterization adopted in v1.7.
  • castleCSF (Ashraf et al.) — per-channel chromatic contrast sensitivity functions.
  • arXiv — open preprint infrastructure. Multiple foundational papers (FOVI, castleCSF) were accessible because researchers posted preprints.

License

Copyright (c) 2012–2026, Andy Edmonds. All rights reserved. Licensed under the MIT License.
