Add bulk soil analysis PDF upload with field matching #449
Conversation
… them in a table and connect to a field
🦋 Changeset detected. Latest commit: f2a42fd. The changes in this PR will be included in the next version bump. This PR includes changesets to release 1 package.
Note: Reviews paused. This branch is under active development, so CodeRabbit has automatically paused this review to avoid an influx of review comments from new commits.
Walkthrough

Adds a two-step bulk soil analysis upload and review flow: multi-PDF Dropzone upload, server-side bulk PDF validation and extraction, automatic field matching (geometry/name), a review table to map analyses to fields, and persistence endpoints; plus dashboard/sidebar entry points and changeset entries.
Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant UI as "Bulk Upload UI"
    participant Route as "Bulk Upload Route"
    participant NMI as "NMI Integration"
    participant DB as "Database"
    User->>UI: Drag & drop multiple PDFs
    UI->>UI: Validate & stage files (client, 5MB/file)
    User->>UI: Submit files
    UI->>Route: POST FormData (files)
    Route->>NMI: extractBulkSoilAnalyses(formData)
    NMI->>NMI: Validate PDFs, extract analyses per file
    NMI-->>Route: Return processed analyses array
    Route-->>UI: Respond with analyses data
    UI->>UI: Auto-match analyses to fields (geometry/name)
    User->>UI: Inspect/adjust matches in Review UI
    UI->>Route: POST matches + analysesData (save)
    Route->>DB: addSoilAnalysis(...) per matched item
    DB-->>Route: Persisted
    Route-->>User: Redirect / success
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ❌ 1 failed (1 warning) | ✅ 4 passed
Actionable comments posted: 8
🤖 Fix all issues with AI agents
In `@fdm-app/app/components/blocks/soil/bulk-upload-form.tsx`:
- Around line 74-103: The Dropzone currently advertises "max 5MB per bestand"
but doesn't enforce it; add a maxSize prop to the Dropzone component (alongside
the existing name="soilAnalysisFiles", accept, multiple, value={files},
onFilesChange={handleFilesChange}) set to 5 * 1024 * 1024 and update
handleFilesChange to handle the Dropzone's file-rejection behavior so users see
immediate errors; also ensure the same 5MB limit is enforced in the
corresponding client-side Zod schema (e.g., form-upload.tsx) and the server
upload handler so UI and server validation match.
- Around line 53-59: The helper function formatFileSize uses the global
parseFloat which violates the Biome useNumberNamespace rule; update
formatFileSize to call Number.parseFloat instead of parseFloat (leave all other
logic intact) so the function uses Number.parseFloat((bytes / Math.pow(k,
i)).toFixed(2)) + " " + sizes[i]; ensure the symbol name formatFileSize is
updated in-place and no other behavior changes.
- Around line 38-44: The useEffect currently uses isUploading but doesn't
include it in the dependency array; update the effect declared with
useEffect(...) so its dependency list includes isUploading and onSuccess and
fetcher.data (remove fetcher.state since it's only used to derive isUploading)
to satisfy exhaustive-deps and ensure the effect reruns correctly when upload
state changes.
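The `Number.parseFloat` fix described above can be sketched as follows. This is a minimal reconstruction of the helper, assuming the standard bytes-to-KB/MB conversion the comment describes; only the namespaced call changes, not the behavior.

```typescript
// Sketch of the corrected formatFileSize helper: Number.parseFloat satisfies
// Biome's useNumberNamespace rule; all other logic is left intact.
function formatFileSize(bytes: number): string {
    if (bytes === 0) return "0 Bytes"
    const k = 1024
    const sizes = ["Bytes", "KB", "MB", "GB"]
    // Pick the largest unit that keeps the value >= 1
    const i = Math.floor(Math.log(bytes) / Math.log(k))
    // toFixed(2) rounds, Number.parseFloat strips trailing zeros ("1.50" -> 1.5)
    return `${Number.parseFloat((bytes / Math.pow(k, i)).toFixed(2))} ${sizes[i]}`
}
```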
In `@fdm-app/app/components/blocks/soil/bulk-upload-review.tsx`:
- Around line 99-101: The Badge rendering currently checks row.original.a_p_al
but displays row.original.a_p_cc, causing "P-CaCl2: undefined" when a_p_cc is
missing; change the conditional to check for row.original.a_p_cc (or a
null/undefined-safe check like != null) before rendering the Badge so the
displayed value always exists; update the JSX where the Badge is created (the
Badge component line in bulk-upload-review.tsx that references a_p_al and
a_p_cc) to use row.original.a_p_cc in the condition.
In `@fdm-app/app/integrations/nmi.ts`:
- Around line 313-319: The bulk-path code handling field.b_depth assigns
Number(...) directly and can store NaN; update the logic around field.b_depth so
it first validates the "X-Y" format and that both parsed parts are finite
numbers (e.g., use isFinite after Number conversion or Number.isFinite) and only
then set soilAnalysis.a_depth_upper and soilAnalysis.a_depth_lower; if
validation fails, do not assign those properties (or set them to undefined/null)
to match the single-file flow. Ensure you touch the block referencing
field.b_depth and the soilAnalysis.a_depth_upper / soilAnalysis.a_depth_lower
assignments so invalid input cannot be persisted.
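The `b_depth` validation asked for above can be sketched as a small pure function. The field names `b_depth`, `a_depth_upper`, and `a_depth_lower` come from the PR; `parseDepthRange` itself is an illustrative name, and the exact "X-Y" contract is an assumption based on the comment.

```typescript
// Parse a "X-Y" depth string; return undefined (rather than NaN-bearing
// values) whenever the format or either number is invalid, matching the
// single-file flow the comment references.
function parseDepthRange(
    b_depth: string | undefined,
): { a_depth_upper: number; a_depth_lower: number } | undefined {
    if (!b_depth) return undefined
    const parts = b_depth.split("-")
    if (parts.length !== 2) return undefined
    const upper = Number(parts[0])
    const lower = Number(parts[1])
    // Number("abc") is NaN; Number.isFinite rejects NaN and Infinity
    if (!Number.isFinite(upper) || !Number.isFinite(lower)) return undefined
    return { a_depth_upper: upper, a_depth_lower: lower }
}
```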
In `@fdm-app/app/routes/farm.$b_id_farm._index.tsx`:
- Around line 399-412: The JSX uses an unnecessary template literal for the
NavLink target; update the NavLink element (the JSX using NavLink with
to={`soil-analysis/bulk`}) to use a plain string literal instead
(to="soil-analysis/bulk") to satisfy the noUnusedTemplateLiteral rule and avoid
unneeded interpolation.
In `@fdm-app/app/routes/farm.$b_id_farm.soil-analysis.bulk.tsx`:
- Around line 75-95: The code maps over matches and calls addSoilAnalysis using
Number(analysis.a_depth_lower) / Number(analysis.a_depth_upper) which yields NaN
for missing/non-numeric depths and bypasses addSoilAnalysis validation; before
invoking Promise.all (in the matches.map block) validate that
analysis.a_depth_lower and analysis.a_depth_upper are present and are finite
numbers (e.g., Number.isFinite(Number(...))) and either skip/reject the
offending analysis (or return a rejected Promise) with a clear error so
addSoilAnalysis never receives NaN; reference the analysesData lookup, the
matches.map callback, and the a_depth_lower/a_depth_upper values when
implementing this pre-check.
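The pre-check described above can be sketched as a partition step before `Promise.all`. The depth field names come from the PR; `partitionByValidDepths` and `toFiniteNumber` are illustrative helpers, and the exact shape of the analysis records is an assumption.

```typescript
// Split matched analyses into saveable items and skipped items so
// addSoilAnalysis never receives NaN depths.
type DepthFields = {
    a_depth_upper?: string | number
    a_depth_lower?: string | number
}

// Number("") is 0 and Number(undefined) is NaN, so guard both explicitly.
function toFiniteNumber(value: unknown): number | undefined {
    if (value == null) return undefined
    if (typeof value === "string" && value.trim() === "") return undefined
    const n = Number(value)
    return Number.isFinite(n) ? n : undefined
}

function partitionByValidDepths<T extends DepthFields>(
    analyses: T[],
): { valid: T[]; skipped: T[] } {
    const valid: T[] = []
    const skipped: T[] = []
    for (const a of analyses) {
        if (
            toFiniteNumber(a.a_depth_upper) !== undefined &&
            toFiniteNumber(a.a_depth_lower) !== undefined
        ) {
            valid.push(a)
        } else {
            skipped.push(a)
        }
    }
    return { valid, skipped }
}
```

Only the `valid` partition would then feed the `matches.map` callback; the `skipped` entries can be surfaced to the user with a clear error instead of failing deep inside `addSoilAnalysis`.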
In `@fdm-app/app/routes/farm.create.$b_id_farm.$calendar.soil-analysis.bulk.tsx`:
- Line 51: Remove the unnecessary await when calling the synchronous function
getCalendar in the bulk soil-analysis route: change the usage that assigns to
calendar from "await getCalendar(params)" to a plain synchronous call
"getCalendar(params)"; update any related variable assignment in that scope
(calendar) to reflect the synchronous return type so no Promise handling is used
and ensure consistency with the loader which already calls getCalendar without
await.
🧹 Nitpick comments (5)
fdm-app/app/components/blocks/soil/bulk-upload-review.tsx (1)
**18-34**: Consider exporting the `Field` type for reuse.

The `ProcessedAnalysis` type is exported but `Field` is not. Since both route files use the same field structure, exporting `Field` would improve type consistency across consumers.

```diff
-type Field = {
+export type Field = {
     b_id: string
     b_name: string
 }
```

fdm-app/app/routes/farm.$b_id_farm.soil-analysis.bulk.tsx (2)
**112-112**: Unused import: `navigate` is declared but never used.

The `navigate` variable from `useNavigate()` is not used in this component.

♻️ Proposed fix

```diff
-    const navigate = useNavigate()
```

And update the import:

```diff
-import { data, type LoaderFunctionArgs, type ActionFunctionArgs, useLoaderData, useNavigate, useNavigation, useSubmit } from "react-router"
+import { data, type LoaderFunctionArgs, type ActionFunctionArgs, useLoaderData, useNavigation, useSubmit } from "react-router"
```
**118-151**: Consider extracting matching logic to a shared utility.

The `handleUploadSuccess` function with geometry and name matching is duplicated verbatim in farm.create.$b_id_farm.$calendar.soil-analysis.bulk.tsx. Extracting this to a shared utility (e.g., `~/lib/soil-analysis-matching.ts`) would improve maintainability.

fdm-app/app/routes/farm.create.$b_id_farm.$calendar.soil-analysis.bulk.tsx (2)
**102-102**: Unused import: `navigate` is declared but never used.

Same issue as in the other bulk route file.

♻️ Proposed fix

```diff
-    const navigate = useNavigate()
```

And update the import:

```diff
-import { data, type LoaderFunctionArgs, type ActionFunctionArgs, useLoaderData, useNavigate, useNavigation, useSubmit } from "react-router"
+import { data, type LoaderFunctionArgs, type ActionFunctionArgs, useLoaderData, useNavigation, useSubmit } from "react-router"
```
**108-139**: Duplicated matching logic should be consolidated.

This `handleUploadSuccess` implementation is identical to the one in farm.$b_id_farm.soil-analysis.bulk.tsx. Consider extracting to a shared utility function.
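The shared utility suggested above could look roughly like this. The PR matches by geometry or name; geometry matching depends on the project's GIS helpers, so this sketch shows only case-insensitive name matching as a stand-in. `matchAnalysesToFields` and the `fieldName` property are illustrative names, not taken from the PR.

```typescript
// Sketch of a shared matching utility (the suggested location is
// ~/lib/soil-analysis-matching.ts): map each analysis id to a field id,
// or undefined when no field matches.
type Field = { b_id: string; b_name: string }
type Analysis = { id: string; fieldName?: string }

function matchAnalysesToFields(
    analyses: Analysis[],
    fields: Field[],
): Record<string, string | undefined> {
    const byName = new Map(fields.map((f) => [f.b_name.toLowerCase(), f.b_id]))
    return Object.fromEntries(
        analyses.map((a) => [
            a.id,
            a.fieldName ? byName.get(a.fieldName.toLowerCase()) : undefined,
        ]),
    )
}
```

Both bulk routes could then call this one function from their `handleUploadSuccess`, so future changes to the matching rules land in a single place.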
Codecov Report

❌ Patch coverage is

Additional details and impacted files:

```
@@           Coverage Diff            @@
##          development     #449      +/- ##
===============================================
- Coverage       88.07%   88.04%   -0.04%
===============================================
  Files              91       91
  Lines            4621     4625       +4
  Branches         1492     1494       +2
===============================================
+ Hits             4070     4072       +2
- Misses            551      553       +2
```
Actionable comments posted: 3
🤖 Fix all issues with AI agents
In `@fdm-app/app/components/blocks/soil/bulk-upload-review.tsx`:
- Around line 193-197: The saved matches currently only filter empty strings but
still include selections equal to "none"; update the handleSave logic (function
handleSave, variable matches and the onSave call) to exclude entries where
fieldId is an empty string or exactly "none" (case-sensitive per the contract)
before mapping to { analysisId, fieldId } so onSave only receives real mappings.
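The `handleSave` filter described above can be sketched as a pure function. `toSavedMatches` is an illustrative name; the `{ analysisId, fieldId }` payload shape and the `"none"` placeholder come from the review comment.

```typescript
// Drop both empty-string and "none" placeholder selections before
// building the payload passed to onSave, so only real mappings remain.
function toSavedMatches(
    matches: Record<string, string>,
): { analysisId: string; fieldId: string }[] {
    return Object.entries(matches)
        .filter(([, fieldId]) => fieldId !== "" && fieldId !== "none")
        .map(([analysisId, fieldId]) => ({ analysisId, fieldId }))
}
```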
In `@fdm-app/app/integrations/nmi.ts`:
- Around line 16-24: The validatePdfMagicBytes function currently only checks
PDF magic bytes but needs a server-side size guard; update validatePdfMagicBytes
to first verify file.size is <= 5 * 1024 * 1024 (5 MB) and throw a clear Error
(e.g., "invalid: Bestand \"<name>\" is te groot, limiet 5MB") if it exceeds that
limit before reading the ArrayBuffer, then proceed with the existing
fileTypeFromBuffer checks; reference the validatePdfMagicBytes function and the
File parameter to locate the change.
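The guard above can be sketched as follows. The PR's `validatePdfMagicBytes` uses `fileTypeFromBuffer`; this stand-in checks the `%PDF-` magic bytes directly so the sketch stays self-contained, and `checkPdf` is an illustrative name. The key point is that the size check runs before any buffer is read.

```typescript
// Server-side guard: enforce the 5 MB limit first, then verify the file
// really starts with the "%PDF-" magic bytes.
const MAX_PDF_SIZE = 5 * 1024 * 1024 // 5 MB, mirroring the client-side limit

function checkPdf(name: string, size: number, bytes: Uint8Array): void {
    if (size > MAX_PDF_SIZE) {
        throw new Error(`invalid: Bestand "${name}" is te groot, limiet 5MB`)
    }
    const magic = [0x25, 0x50, 0x44, 0x46, 0x2d] // ASCII for "%PDF-"
    if (bytes.length < magic.length || !magic.every((b, i) => bytes[i] === b)) {
        throw new Error(`invalid: Bestand "${name}" is geen geldige PDF`)
    }
}
```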
In `@fdm-app/app/routes/farm.$b_id_farm.soil-analysis.bulk.tsx`:
- Around line 214-243: The code currently constructs new
Date(analysis.b_sampling_date) without validation; replicate the a_depth_lower
validation: parse and check analysis.b_sampling_date (e.g., const samplingDate =
new Date(analysis.b_sampling_date); if (!analysis.b_sampling_date ||
Number.isNaN(samplingDate.getTime())) { console.warn(...) ; return }) before
calling addSoilAnalysis so invalid or missing sampling dates are skipped; ensure
you use samplingDate when calling addSoilAnalysis (function name
addSoilAnalysis, variables analysis.b_sampling_date, analysesData, depthLower,
depthUpper).
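The sampling-date guard above can be sketched as a small helper that parses once and signals invalid input with `undefined`, mirroring the depth validation. `parseSamplingDate` is an illustrative name; `b_sampling_date` is the field from the PR.

```typescript
// Parse b_sampling_date once; return undefined for missing or malformed
// dates so the caller can skip the analysis instead of passing an
// Invalid Date to addSoilAnalysis.
function parseSamplingDate(raw: string | undefined | null): Date | undefined {
    if (!raw) return undefined
    const d = new Date(raw)
    // new Date("not-a-date").getTime() is NaN
    return Number.isNaN(d.getTime()) ? undefined : d
}
```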
🧹 Nitpick comments (1)
fdm-app/app/components/blocks/soil/bulk-upload-form.tsx (1)
**200-205**: Disable "Start Analyse" when no files are selected.

This avoids a no-op click path and makes the state clearer.

🔧 Suggested tweak

```diff
 <Button
     onClick={handleUpload}
-    disabled={isUploading}
+    disabled={isUploading || files.length === 0}
     className="w-full lg:w-auto min-w-[140px]"
 >
```
It seems I can still link an invalid PDF, which brings me to the Opslaan & Koppelen stage. Then I get an internal server error.
Thanks, I resolved it in 6745a11 |
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@fdm-app/app/components/blocks/soil/bulk-upload-review.tsx`:
- Around line 105-112: The cell renderer for accessorKey "b_sampling_date" calls
format(new Date(...)) without validating the date, which can throw RangeError;
update the cell function in bulk-upload-review.tsx to construct the Date once
(e.g., const d = new Date(row.original.b_sampling_date)), check validity (using
date-fns isValid(d) or d instanceof Date && !isNaN(d.getTime())), and only call
format(d, "P", { locale: nl }) when valid; otherwise return "-" to safely guard
against invalid or malformed dates.
🧹 Nitpick comments (2)
fdm-app/app/components/blocks/soil/bulk-upload-review.tsx (1)
**69-71**: Prefer `undefined` instead of `""` for unselected field values.

This avoids empty-string selections and aligns with the Select component's expected "no value" state; it also simplifies filtering.

Based on learnings: In fdm-app Select components, using `undefined` instead of empty string (`""`) as fallback value prevents empty strings from being submitted as form values.

♻️ Proposed refactor

```diff
-    const [matches, setMatches] = useState<Record<string, string>>(
-        Object.fromEntries(analyses.map((a) => [a.id, a.matchedFieldId || ""])),
-    )
+    const [matches, setMatches] = useState<Record<string, string | undefined>>(
+        Object.fromEntries(
+            analyses.map((a) => [a.id, a.matchedFieldId || undefined]),
+        ),
+    )
@@
     <Select
         value={matches[row.original.id]}
         onValueChange={(value) =>
             handleFieldChange(row.original.id, value)
         }
     >
@@
         .filter(([analysisId, fieldId]) => {
-            if (fieldId === "" || fieldId === "none") return false
+            if (!fieldId || fieldId === "none") return false
```

Also applies to: 146-163, 206-215
fdm-app/app/routes/farm.create.$b_id_farm.$calendar.soil-analysis.bulk.tsx (1)
**82-130**: Consider extracting shared bulk-save helpers to reduce drift.

The matching and validation logic is duplicated across the two bulk routes; a shared utility would reduce maintenance risk if the logic evolves.
Also applies to: 198-253
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com> Signed-off-by: Sven Verweij <37927107+SvenVw@users.noreply.github.com>
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@fdm-app/app/components/blocks/soil/bulk-upload-review.tsx`:
- Around line 120-139: The badges in bulk-upload-review.tsx (e.g., checks around
row.original.a_som_loi, a_p_al, a_p_cc, a_nmin_cc) only guard against undefined
and will render literal "null" if the API returns null; change the presence
checks to use loose null checks (e.g., value != null) for each of those fields
so both null and undefined are treated as absent, and keep the existing display
formatting (including the "%" suffix for a_som_loi) only when the value is
present.
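The loose null check suggested above can be illustrated with a tiny type guard; `hasValue` is an illustrative name, not from the PR.

```typescript
// A != null check treats both null and undefined as absent, while still
// accepting 0 as a legitimate soil measurement, so the Badge never
// renders a literal "null".
function hasValue(v: number | null | undefined): v is number {
    return v != null
}
```

Each badge condition (e.g., for `a_som_loi`, `a_p_al`, `a_p_cc`, `a_nmin_cc`) would use this guard while keeping the existing display formatting.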
🧹 Nitpick comments (3)
fdm-app/app/components/blocks/soil/bulk-upload-review.tsx (3)
**48-48**: Consider narrowing the `data` type from `any`.

`data: any` in an exported type leaks an untyped surface to consumers. Using `Record<string, unknown>` would enforce explicit type narrowing at usage sites and prevent accidental unsafe access.
**167-198**: Extract the repeated date-validation logic into a helper.

The same `b_sampling_date && !Number.isNaN(new Date(...).getTime())` pattern appears in the date column cell (Line 108), the status column cell (Line 173), `handleSave` (Line 212), the table-body row styling (Line 254), and the disabled-button predicate (Line 324). A single helper eliminates the duplication and the repeated `new Date` allocations.

♻️ Proposed helper

```diff
+const isValidDate = (dateStr: string | undefined | null): boolean => {
+    if (!dateStr) return false
+    const d = new Date(dateStr)
+    return !Number.isNaN(d.getTime())
+}
```

Then replace each inline check, e.g.:

```diff
-    const isValid =
-        row.original.b_sampling_date &&
-        !Number.isNaN(new Date(row.original.b_sampling_date).getTime())
+    const isValid = isValidDate(row.original.b_sampling_date)
```
**317-328**: The `disabled` predicate duplicates the `handleSave` filter logic.

Lines 320-327 reimplement the same filter as `handleSave` (Lines 208-216). If eligibility criteria change, both must stay in sync. Consider extracting a shared derived value.

♻️ Proposed approach

Compute the valid matches once (e.g., via `useMemo`) and reuse in both places:

```diff
+    const validMatches = useMemo(
+        () =>
+            Object.entries(matches)
+                .filter(([analysisId, fieldId]) => {
+                    if (fieldId === "" || fieldId === "none") return false
+                    const analysis = analyses.find((a) => a.id === analysisId)
+                    return isValidDate(analysis?.b_sampling_date)
+                })
+                .map(([analysisId, fieldId]) => ({ analysisId, fieldId })),
+        [matches, analyses],
+    )
+
     const handleSave = () => {
-        const result = Object.entries(matches)
-            .filter(([analysisId, fieldId]) => { ... })
-            .map(...)
-        onSave(result)
+        onSave(validMatches)
     }
```

Then in the button:

```diff
     <Button
         onClick={handleSave}
-        disabled={
-            !Object.entries(matches).some(([analysisId, fieldId]) => { ... })
-        }
+        disabled={validMatches.length === 0}
     >
```
…are estimates and multiple PDFs can be uploaded
Closes #439