59 commits
f602bf1
Add HITL finetuning test data generation infrastructure
davidackerman Feb 9, 2026
fd79e46
Add LoRA wrapper and CorrectionDataset for HITL finetuning
davidackerman Feb 9, 2026
7069780
Add LoRA training loop and CLI for HITL finetuning
davidackerman Feb 9, 2026
a2b7a92
Fix LoRA wrapper for Sequential models and add comprehensive document…
davidackerman Feb 9, 2026
dee0d52
Complete HITL finetuning pipeline with full component validation
davidackerman Feb 9, 2026
5b1def4
Fix zarr structure for Neuroglancer/OME-NGFF compatibility
davidackerman Feb 9, 2026
598d494
Fix dataset to handle mismatched raw/mask sizes
davidackerman Feb 9, 2026
f2e7667
Disable patching to use full-size corrections for training
davidackerman Feb 9, 2026
1e1e89c
Fix CLI to handle None patch_shape
davidackerman Feb 9, 2026
6af57d8
Disable spatial augmentation for mismatched input/output sizes
davidackerman Feb 9, 2026
d6d59f8
Add mito correction generator with fly_organelles model
davidackerman Feb 11, 2026
fd778bb
Add channel selection and logging for LoRA finetuning
davidackerman Feb 11, 2026
f27f3be
Add diagnostic tools for analyzing finetuning quality
davidackerman Feb 11, 2026
6b4e9e8
Document LoRA finetuning workflow with detailed walkthrough
davidackerman Feb 11, 2026
d879a6a
Fix normalization pipeline and add sparse annotation support
davidackerman Feb 11, 2026
e2a2b87
Add utility scripts for sparse annotation workflow
davidackerman Feb 11, 2026
c347bae
Add MinIO hosting scripts for Neuroglancer annotations
davidackerman Feb 12, 2026
19e4285
Remove unnecessary HTTP and legacy MinIO scripts
davidackerman Feb 12, 2026
343abbe
Add finetune annotation crop viewer integration
davidackerman Feb 12, 2026
e46e0b9
Add bidirectional MinIO annotation syncing and improve finetuning wor…
davidackerman Feb 12, 2026
0c1876e
Update finetuning documentation with dashboard workflow
davidackerman Feb 12, 2026
82f704d
Improve finetuning: fix gradient accumulation bug, add live log strea…
davidackerman Feb 13, 2026
dfc3380
Add finetuning job management system and dashboard integration
davidackerman Feb 13, 2026
8aee588
Add auto-serve inference after finetuning and iterative training on s…
davidackerman Feb 14, 2026
a8505e1
Add auto-serve status display and restart training UI to finetune tab
davidackerman Feb 14, 2026
dc1713e
Fix dark mode styling for modals, form controls, labels, and muted text
davidackerman Feb 14, 2026
48689c1
Fix chunk boundary bug in log marker detection for neuroglancer layer…
davidackerman Feb 14, 2026
1c4323e
Filter noisy debug and werkzeug lines from training log stream
davidackerman Feb 14, 2026
c268747
Fix restart layer update: run iteration check every monitor cycle
davidackerman Feb 14, 2026
923ef9e
Fix training collapse and add MSE loss with label smoothing
davidackerman Feb 14, 2026
0ec9965
Clamp dataloader batch size to dataset size
davidackerman Feb 14, 2026
2fa852e
Add margin loss, teacher distillation, and expanded training UI controls
davidackerman Feb 14, 2026
2d89327
Add class balancing option to prevent foreground overprediction
davidackerman Feb 14, 2026
c828a75
Fix duplicate log lines by removing redundant file write
davidackerman Feb 14, 2026
510b4df
Fix MinIO startup failure when console port is already in use
davidackerman Feb 15, 2026
076ee62
Add GPU queue selection to finetuning UI
davidackerman Feb 15, 2026
4fa99c5
Fix delayed log streaming by disabling pipe buffering
davidackerman Feb 15, 2026
f7cc402
Save only LoRA weights in checkpoints instead of full model
davidackerman Feb 15, 2026
e4909ab
Improve restart control path and CLI logging behavior
davidackerman Feb 15, 2026
9ef0be7
Speed up finetune dashboard log streaming and restart UX
davidackerman Feb 15, 2026
f9e19d1
Add model load timing logs and fs visibility probe tool
davidackerman Feb 15, 2026
af06fff
Update finetune progress UI from live SSE epoch/loss logs
davidackerman Feb 15, 2026
0801574
Update finetune UI, dataset, and job manager behavior
davidackerman Feb 18, 2026
d3d7e9b
Remove dev scripts, docs artifacts, and output binaries
davidackerman Feb 18, 2026
6aa5f25
Revert non-finetune changes to match main
davidackerman Feb 18, 2026
7c3bf59
Refactor finetune module and restore blueprint architecture
davidackerman Feb 18, 2026
8b8e2dd
Fix split_dataset_path for paths with nested .zarr segments
davidackerman Feb 18, 2026
b5ee861
Deduplicate sync logic, job status, and dataset path extraction
davidackerman Feb 18, 2026
c90150b
Merge branch 'main' into finetuning_refactor
davidackerman Feb 25, 2026
d46f9ea
Add target transform classes for converting annotations to training t…
davidackerman Feb 26, 2026
26da1ba
Support target_transform in LoRAFinetuner
davidackerman Feb 26, 2026
216fd2e
Add --output-type, --select-channel, --offsets CLI args and target tr…
davidackerman Feb 26, 2026
d82fdd9
Wire output_type, select_channel, and offsets through dashboard and j…
davidackerman Feb 26, 2026
98bd456
Fix formatting in model_spec_affinities example
davidackerman Feb 26, 2026
37fa61a
Add finetuning guide documentation
davidackerman Feb 26, 2026
5cafc67
Remove normalize parameter from CorrectionDataset and create_dataloader
davidackerman Feb 26, 2026
abfbb35
Move dashboard state from state.py into Flow singleton in globals.py
davidackerman Feb 26, 2026
e49acfa
Add finetuning screenshots and image references to guide
davidackerman Feb 26, 2026
3d1f784
Move test_target_transforms.py to tests/finetune/
davidackerman Feb 26, 2026
11 changes: 10 additions & 1 deletion .gitignore
@@ -162,4 +162,13 @@ cython_debug/
 #.idea/
 
 # Misc
-.vscode/
+.vscode/
+
+# Project-specific
+ignore/
+*.zarr/
+.claude/
+test_corrections.zarr/
+correction_slices/
+corrections/
+output/
10 changes: 7 additions & 3 deletions cellmap_flow/cli/server_cli.py
@@ -23,7 +23,6 @@
 from cellmap_flow.utils.plugin_manager import load_plugins
 
 
-logging.basicConfig()
 logger = logging.getLogger(__name__)
 
 
@@ -47,7 +46,7 @@ def cli(log_level):
     cellmap_flow_server script -s /path/to/script.py -d /path/to/data
     cellmap_flow_server cellmap-model -f /path/to/model -n mymodel -d /path/to/data
     """
-    logging.basicConfig(level=getattr(logging, log_level.upper()))
+    logging.basicConfig(level=getattr(logging, log_level.upper()), force=True)
 
 
 @cli.command(name="list-models")
@@ -82,6 +81,9 @@ def create_dynamic_server_command(cli_name: str, config_class: Type[ModelConfig]
     except:
         type_hints = {}
 
+    # Track used short names to avoid collisions with common options.
+    used_short_names = {"-d", "-p"}
+
     # Create the command function
     def command_func(**kwargs):
         # Separate model config kwargs from server kwargs
@@ -141,7 +143,9 @@ def command_func(**kwargs):
 
     # Add model-specific options based on constructor parameters
     for param_name, param_info in reversed(list(sig.parameters.items())):
-        option_config = create_click_option_from_param(param_name, param_info)
+        option_config = create_click_option_from_param(
+            param_name, param_info, used_short_names
+        )
         if option_config:
             command_func = click.option(
                 *option_config.pop("param_decls"), **option_config
77 changes: 77 additions & 0 deletions cellmap_flow/cli/viewer_cli.py
@@ -0,0 +1,77 @@
"""
Simple CLI for viewing datasets with CellMap Flow without requiring model configs.
"""

import click
import logging
import neuroglancer
from cellmap_flow.dashboard.app import create_and_run_app
from cellmap_flow.globals import g
from cellmap_flow.utils.scale_pyramid import get_raw_layer

logging.basicConfig()
logger = logging.getLogger(__name__)


@click.command()
@click.option(
    "-d",
    "--dataset",
    required=True,
    type=str,
    help="Path to the dataset (zarr or n5)",
)
@click.option(
    "--log-level",
    type=click.Choice(
        ["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"], case_sensitive=False
    ),
    default="INFO",
    help="Set the logging level",
)
def main(dataset, log_level):
    """
    Start CellMap Flow viewer with a dataset.

    Example:
        cellmap_flow_viewer -d /path/to/dataset.zarr
    """
    logging.basicConfig(level=getattr(logging, log_level.upper()))

    logger.info(f"Starting CellMap Flow viewer with dataset: {dataset}")

    # Set up neuroglancer server
    neuroglancer.set_server_bind_address("0.0.0.0")

    # Create viewer
    viewer = neuroglancer.Viewer()

    # Set dataset path in globals
    g.dataset_path = dataset
    g.viewer = viewer

    # Add dataset layer to viewer
    with viewer.txn() as s:
        # Set coordinate space
        s.dimensions = neuroglancer.CoordinateSpace(
            names=["z", "y", "x"],
            units="nm",
            scales=[8, 8, 8],
        )

        # Add data layer
        s.layers["data"] = get_raw_layer(dataset)

    # Print viewer URL
    logger.info(f"Neuroglancer viewer URL: {viewer}")
    print(f"\n{'='*80}")
    print(f"Neuroglancer viewer: {viewer}")
    print(f"Dataset: {dataset}")
    print(f"{'='*80}\n")

    # Start the dashboard app
    create_and_run_app(neuroglancer_url=str(viewer), inference_servers=None)


if __name__ == "__main__":
    main()
9 changes: 5 additions & 4 deletions cellmap_flow/dashboard/app.py
@@ -5,15 +5,15 @@
 from flask import Flask
 from flask_cors import CORS
 
-from cellmap_flow.dashboard import state
-from cellmap_flow.dashboard.state import LogHandler
+from cellmap_flow.globals import g, LogHandler
 from cellmap_flow.dashboard.routes.logging_routes import logging_bp
 from cellmap_flow.dashboard.routes.index_page import index_bp
 from cellmap_flow.dashboard.routes.pipeline_builder_page import pipeline_builder_bp
 from cellmap_flow.dashboard.routes.models import models_bp
 from cellmap_flow.dashboard.routes.pipeline import pipeline_bp
 from cellmap_flow.dashboard.routes.blockwise import blockwise_bp
 from cellmap_flow.dashboard.routes.bbx_generator import bbx_bp
+from cellmap_flow.dashboard.routes.finetune_routes import finetune_bp
 
 logger = logging.getLogger(__name__)
 
@@ -37,11 +37,12 @@
 app.register_blueprint(pipeline_bp)
 app.register_blueprint(blockwise_bp)
 app.register_blueprint(bbx_bp)
+app.register_blueprint(finetune_bp)
 
 
 def create_and_run_app(neuroglancer_url=None, inference_servers=None):
-    state.NEUROGLANCER_URL = neuroglancer_url
-    state.INFERENCE_SERVER = inference_servers
+    g.NEUROGLANCER_URL = neuroglancer_url
+    g.INFERENCE_SERVER = inference_servers
     hostname = socket.gethostname()
     port = 0
     logger.warning(f"Host name: {hostname}")
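The app.py change, together with the "Move dashboard state from state.py into Flow singleton in globals.py" commit, replaces free module-level variables with attributes on one shared `g` object. A minimal sketch of that singleton pattern (hypothetical `Flow` class body shown only for illustration; the real `globals.py` may differ):

```python
class Flow:
    """Shared dashboard state; every construction returns the same instance."""

    _instance = None

    def __new__(cls):
        # Create the single instance lazily on first access.
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance.NEUROGLANCER_URL = None
            cls._instance.INFERENCE_SERVER = None
        return cls._instance


# Module-level handle that other modules import, e.g.
# `from cellmap_flow.globals import g`.
g = Flow()

# Any module that sets g.NEUROGLANCER_URL sees the same object as every
# other importer, avoiding the scattered module globals of state.py.
```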