feat: NiFi 2.x upgrade, ABACEnforcement processor, and passthrough mode for ConvertToZTDF #63
JBCongdon wants to merge 10 commits into opentdf:main from
Conversation
Note: Reviews paused. It looks like this branch is under active development; to avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. This behavior can be configured in the CodeRabbit settings.
📝 Walkthrough
Adds a new ABACEnforcement NiFi processor; changes several PropertyDescriptor expression-language scopes from VARIABLE_REGISTRY to ENVIRONMENT; introduces ENABLE_ENCRYPTION in ConvertToZTDF; registers new processors; updates Maven/Java/NiFi toolchain versions; adds a test dependency and a small test-runner change.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client as NiFi FlowFile Source
    participant ABAC as ABACEnforcement
    participant DSP as Authorization Service
    Client->>ABAC: Submit FlowFile
    activate ABAC
    ABAC->>ABAC: Resolve entity (CLIENT_ID/EMAIL/USERNAME)
    ABAC->>ABAC: Determine resource attribute FQNs (attr or default)
    ABAC->>ABAC: Build GetDecisionsRequest
    ABAC->>DSP: Timed gRPC call getDecisions()
    activate DSP
    DSP-->>ABAC: Decision response
    deactivate DSP
    ABAC->>ABAC: Compute overall decision (any non-PERMIT => DENY)
    ABAC->>ABAC: Set attributes (abac.decision, abac.entity_id, abac.resource_attributes, abac.processing_time_ms)
    alt Decision = PERMIT
        ABAC-->>Client: Transfer to 'permit'
    else Decision = DENY
        ABAC-->>Client: Transfer to 'deny'
    else RPC/Error
        alt Fail Open = true
            ABAC->>ABAC: Set PERMIT and abac.error
            ABAC-->>Client: Transfer to 'permit'
        else Fail Open = false
            ABAC-->>Client: Transfer to 'failure'
        end
    end
    deactivate ABAC
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 4
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractTDFProcessor.java (1)
42-49: ⚠️ Potential issue | 🟠 Major
Use the helper method to evaluate ENVIRONMENT-scoped expressions for FLOWFILE_PULL_SIZE.
Line 46 declares ENVIRONMENT expression language support, but line 195 calls `asInteger()` directly without evaluating expressions. The class already provides `getPropertyValue(PropertyValue propertyValue)` (line 112), which checks for and evaluates expressions. If operators configure the pull limit via environment variables or system properties (e.g., `${env:PULL_SIZE}`), this code path will fail to resolve them and crash at runtime.
Suggested fix
```diff
 @Override
 public void onTrigger(ProcessContext processContext, ProcessSession processSession) throws ProcessException {
-    List<FlowFile> flowFiles = processSession.get(processContext.getProperty(FLOWFILE_PULL_SIZE).asInteger());
+    final Integer pullSize = getPropertyValue(processContext.getProperty(FLOWFILE_PULL_SIZE)).asInteger();
+    if (pullSize == null) {
+        throw new ProcessException("FlowFile queue pull limit must resolve to an integer");
+    }
+    List<FlowFile> flowFiles = processSession.get(pullSize);
     if (!flowFiles.isEmpty()) {
         processFlowFiles(processContext, processSession, flowFiles);
     }
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractTDFProcessor.java` around lines 42 - 49, The FLOWFILE_PULL_SIZE PropertyDescriptor supports ENVIRONMENT expression language but callers are calling PropertyValue.asInteger() directly; change those call sites to first evaluate the PropertyValue via the existing getPropertyValue(PropertyValue propertyValue) helper and then call asInteger() on the returned PropertyValue so environment/system expressions (e.g., ${env:PULL_SIZE}) are resolved; specifically, replace direct uses of FLOWFILE_PULL_SIZE.asInteger() (or direct PropertyValue.asInteger() calls) with getPropertyValue(FLOWFILE_PULL_SIZE).asInteger() (or pass the PropertyValue into getPropertyValue(...) before asInteger()) so expressions are evaluated correctly.
🧹 Nitpick comments (1)
nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java (1)
213-219: Add a passthrough-mode test.
The provided `ConvertToZTDFTest` coverage only exercises the encryption-enabled path. This new bypass branch changes both content and attribute behavior, so it needs a focused test asserting the original payload is preserved and `mime.type` is not rewritten when encryption is disabled.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java` around lines 213 - 219, Add a new unit test in ConvertToZTDFTest that exercises the passthrough branch when encryptionEnabled is false: initialize the processor (ConvertToZTDF) with encryptionEnabled set to false (or mock the property), create a FlowFile with a known payload and mime.type attribute, run the processor, and assert the FlowFile was transferred to REL_SUCCESS, the content matches the original payload (no wrapping), and the mime.type attribute was not rewritten; use the same test harness/mocks used by other tests so you exercise processSession.transfer and validate attributes and content preservation.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java`:
- Around line 179-191: The current flow builds resource attributes (attrFqnsCsv
/ defaultAttrFqns) and throws a ProcessException for local request-construction
problems, but the later "Fail Open" handling is broad and turns these local
validation errors into permits; change the error-handling so that validation
errors (e.g., missing/blank attrFqnsCsv) always raise a ProcessException and are
NOT converted to permit, and only map failures from the remote authorization
call to permit when the Fail Open flag is set — i.e., keep the existing checks
in ABACEnforcement that validate attributes and throw ProcessException for local
issues, and narrow the catch around the remote authorization invocation (the
client/authorize call) to only catch remote/network/service exceptions
(IOExceptions, client-specific runtime exceptions) and apply Fail Open there; do
not catch or swallow ProcessException or other local validation exceptions when
applying Fail Open.
- Around line 256-257: The current info log in ABACEnforcement uses
getLogger().info(...) and prints raw entityId (PII like EMAIL/USERNAME); change
this to redact or omit entityId at info level and either log a redacted form
(e.g., hash or masked string) or move the full entityId to debug level. Update
the call that references decisionLabel, entityId, attrFqnsCsv, elapsedMs so the
info message contains decisionLabel, attrFqnsCsv and elapsedMs only (or a masked
entity token), and add a separate debug log that includes the original entityId
for troubleshooting (use the same getLogger().debug(...) with entityId). Ensure
the redaction/masking is applied consistently where entityId is used in
ABACEnforcement.
- Around line 233-245: The code currently defaults overallDecision to
DECISION_PERMIT which lets empty/unknown decision sets pass; change the decision
logic in the GetDecisionsResponse handling (authStub.getDecisions /
GetDecisionsResponse / DecisionResponse loop) so that unknown/empty results
default to DECISION_DENY: initialize overallDecision to
DecisionResponse.Decision.DECISION_DENY, iterate responses and if any
DecisionResponse is DECISION_DENY keep/break as DENY, mark if a DECISION_PERMIT
is seen, and only set overallDecision to DECISION_PERMIT when at least one
PERMIT is observed and no DENY was found; this ensures empty or unhandled states
remain DENY.
In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java`:
- Around line 73-82: The ENABLE_ENCRYPTION property currently can yield null
from PropertyValue.asBoolean() when an ENVIRONMENT-scoped expression fails,
allowing silent bypass; update the code that reads this property (e.g., the
onTrigger method in ConvertToZTDF and the equivalent reader in
ABACEnforcement.java) to explicitly check for a null return from
context.getProperty(ENABLE_ENCRYPTION).asBoolean() and throw a clear exception
(or signal validation failure) instead of treating null as false; alternatively,
add an explicit boolean validator to the ENABLE_ENCRYPTION PropertyDescriptor so
unresolved/invalid ENVIRONMENT expressions are rejected at validation time
(reference ENABLE_ENCRYPTION and the null-check pattern used in
SimpleOpenTDFControllerService.java).
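The fail-open scoping described in the first bullet above, where local validation errors always fail closed and only remote-call failures respect the flag, can be sketched framework-free. All names here (FailOpenScoping, RemoteCall, decide) are illustrative, not from the PR:

```java
// Illustrative sketch: fail-open applies only to the remote authorization
// call; local validation errors propagate and are never turned into permits.
class FailOpenScoping {

    static class ValidationException extends RuntimeException {
        ValidationException(String msg) { super(msg); }
    }

    // Stand-in for the remote gRPC getDecisions() invocation.
    interface RemoteCall {
        boolean call() throws Exception;
    }

    static String decide(Runnable validate, RemoteCall authorize, boolean failOpen) {
        // Local validation runs outside the try: a ValidationException
        // propagates to the caller (routing to failure), never to permit.
        validate.run();
        try {
            return authorize.call() ? "permit" : "deny";
        } catch (Exception remoteFailure) {
            // Only remote/network/service failures consult the fail-open flag.
            return failOpen ? "permit" : "failure";
        }
    }
}
```

Narrowing the catch this way keeps malformed flow files from silently bypassing policy when Fail Open is enabled.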
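The deny-by-default aggregation asked for in the third bullet can also be sketched without the gRPC types; the Decision enum below is only a stand-in for DecisionResponse.Decision:

```java
import java.util.List;

// Illustrative sketch of deny-by-default aggregation: an empty or
// unrecognized decision set can never yield PERMIT.
class DecisionAggregation {

    enum Decision { DECISION_PERMIT, DECISION_DENY, DECISION_UNSPECIFIED }

    static Decision aggregate(List<Decision> responses) {
        boolean sawPermit = false;
        for (Decision d : responses) {
            if (d == Decision.DECISION_DENY) {
                return Decision.DECISION_DENY; // any explicit deny wins
            }
            if (d == Decision.DECISION_PERMIT) {
                sawPermit = true;
            }
            // DECISION_UNSPECIFIED contributes neither a permit nor a deny.
        }
        // Permit only when at least one PERMIT was seen and no DENY;
        // an empty response list therefore stays DENY.
        return sawPermit ? Decision.DECISION_PERMIT : Decision.DECISION_DENY;
    }
}
```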
---
Outside diff comments:
In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractTDFProcessor.java`:
- Around line 42-49: The FLOWFILE_PULL_SIZE PropertyDescriptor supports
ENVIRONMENT expression language but callers are calling
PropertyValue.asInteger() directly; change those call sites to first evaluate
the PropertyValue via the existing getPropertyValue(PropertyValue propertyValue)
helper and then call asInteger() on the returned PropertyValue so
environment/system expressions (e.g., ${env:PULL_SIZE}) are resolved;
specifically, replace direct uses of FLOWFILE_PULL_SIZE.asInteger() (or direct
PropertyValue.asInteger() calls) with
getPropertyValue(FLOWFILE_PULL_SIZE).asInteger() (or pass the PropertyValue into
getPropertyValue(...) before asInteger()) so expressions are evaluated
correctly.
---
Nitpick comments:
In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java`:
- Around line 213-219: Add a new unit test in ConvertToZTDFTest that exercises
the passthrough branch when encryptionEnabled is false: initialize the processor
(ConvertToZTDF) with encryptionEnabled set to false (or mock the property),
create a FlowFile with a known payload and mime.type attribute, run the
processor, and assert the FlowFile was transferred to REL_SUCCESS, the content
matches the original payload (no wrapping), and the mime.type attribute was not
rewritten; use the same test harness/mocks used by other tests so you exercise
processSession.transfer and validate attributes and content preservation.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 0f3e6f43-cb70-47bb-973d-2be1da2225b2
📒 Files selected for processing (7)
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractTDFProcessor.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractToProcessor.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/SimpleOpenTDFControllerService.java
- nifi-tdf-processors/src/main/resources/META-INF/services/org.apache.nifi.processor.Processor
- pom.xml
33e79d2 to 9bcb952
Actionable comments posted: 1
♻️ Duplicate comments (1)
nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java (1)
182-185: ⚠️ Potential issue | 🟠 Major
Validate `entityId` after expression evaluation.
`NON_BLANK_VALIDATOR` validates the property expression string (e.g., `${jwt.sub}`), not the runtime-evaluated result. If the referenced flow file attribute is missing or empty, `entityId` will be null/blank, creating an invalid authorization request. This is a local validation failure that should route to `REL_FAILURE`.
🐛 Proposed fix

```diff
 // Resolve entity ID
 String entityId = ctx.getProperty(ENTITY_ID)
         .evaluateAttributeExpressions(flowFile).getValue();
+if (entityId == null || entityId.isBlank()) {
+    throw new ProcessException("Entity ID resolved to blank — check flow file attributes or property configuration");
+}
 String entityType = ctx.getProperty(ENTITY_TYPE).getValue();
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java` around lines 182 - 185, After evaluating ENTITY_ID into the runtime variable entityId in ABACEnforcement (the onTrigger/process method where ctx.getProperty(ENTITY_ID).evaluateAttributeExpressions(...) is called), validate that entityId is not null/empty/blank; if it is blank, set a helpful flowfile attribute (e.g., "abac.error" or similar) and transfer the flowfile to the REL_FAILURE relationship (session.transfer(flowFile, REL_FAILURE)) and return so the invalid authorization request is not sent; ensure you use the evaluated entityId value (not the property expression) for this check and reference ENTITY_ID, entityId, and REL_FAILURE to locate the change.
🧹 Nitpick comments (3)
nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java (3)
62-71: Missing `abac.error` in `@WritesAttributes`.
Line 282 sets `abac.error` when fail-open is triggered, but this attribute is not documented in the `@WritesAttributes` annotation. Add it for completeness.
📝 Proposed fix

```diff
 @WritesAttributes({
     @WritesAttribute(attribute = "abac.decision", description = "PERMIT or DENY"),
     @WritesAttribute(attribute = "abac.entity_id", description = "The entity ID used in the authorization request"),
     @WritesAttribute(attribute = "abac.resource_attributes", description = "Comma-separated resource attribute FQNs that were evaluated"),
     @WritesAttribute(attribute = "abac.processing_time_ms", description = "Time taken for the GetDecisions call in milliseconds"),
+    @WritesAttribute(attribute = "abac.error",
+        description = "Error message when fail-open is triggered due to authorization service failure"),
 })
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java` around lines 62 - 71, The `@WritesAttributes` on ABACEnforcement is missing the "abac.error" attribute which is set when the processor triggers fail-open behavior; update the `@WritesAttributes` annotation (in the ABACEnforcement class) to include `@WritesAttribute`(attribute = "abac.error", description = "Error message when authorization lookup fails (present when fail-open is used)") so the attribute is documented alongside abac.decision, abac.entity_id, abac.resource_attributes, and abac.processing_time_ms.
86-89: Consider using `Set.of()` for the immutable relationship set.
`Set.of()` provides an immutable set directly without intermediate list allocation.
♻️ Proposed fix

```diff
 @Override
 public Set<Relationship> getRelationships() {
-    return new HashSet<>(Arrays.asList(REL_PERMIT, REL_DENY, REL_FAILURE));
+    return Set.of(REL_PERMIT, REL_DENY, REL_FAILURE);
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java` around lines 86 - 89, Replace the mutable HashSet allocation in getRelationships() with an immutable Set created via Set.of to avoid the intermediate list and make the relationships unmodifiable; specifically change the method getRelationships() to return Set.of(REL_PERMIT, REL_DENY, REL_FAILURE) (ensure the method references REL_PERMIT, REL_DENY, REL_FAILURE and that your codebase targets Java 9+).
164-166: Consider null-checking the authorization stub.
If the SDK or its services are improperly configured, `sdk.getServices().authorization()` could return null, leading to an NPE on line 237. A defensive check would provide a clearer error message.
🛡️ Proposed fix

```diff
 SDK sdk = getTDFSDK(ctx);
 AuthorizationServiceGrpc.AuthorizationServiceFutureStub authStub = sdk.getServices().authorization();
+if (authStub == null) {
+    throw new ProcessException("Authorization service stub is unavailable — check SDK configuration");
+}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java` around lines 164 - 166, The call to sdk.getServices().authorization() can return null and later cause an NPE; inside the method where you call getTDFSDK(ctx) and assign AuthorizationServiceGrpc.AuthorizationServiceFutureStub authStub (symbols: getTDFSDK, SDK, AuthorizationServiceGrpc.AuthorizationServiceFutureStub, authStub), add a defensive null-check after obtaining authStub (and optionally after sdk or sdk.getServices()), and if null log a clear error via process logger or throw a descriptive IllegalStateException indicating the authorization service is not configured so callers fail fast with an informative message.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java`:
- Around line 212-219: The loop building ResourceAttribute.attributeValueFqns
from attrFqnsCsv can result in zero FQNs (e.g., input " , , "); after
trimming/splitting but before calling raBuilder.build(), validate that at least
one non-empty FQN was added (inspect raBuilder.getAttributeValueFqnsCount() or
track a counter) and if none, log an error and fail fast (throw a
ProcessException/IllegalArgumentException or route to failure) instead of
sending an authorization request with empty attributeValueFqns; update the code
around ResourceAttribute.Builder raBuilder, attrFqnsCsv and resourceAttribute to
perform this check and handle the error path.
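The guard against empty constraint sets described in the prompt above can be sketched as a plain helper; parseFqns is an illustrative name, and the real code feeds the result into ResourceAttribute.Builder:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: split a CSV of attribute FQNs, drop blank entries,
// and fail fast when nothing usable remains (e.g. " , , "), so an
// authorization request is never sent with an empty attributeValueFqns list.
class FqnParsing {

    static List<String> parseFqns(String attrFqnsCsv) {
        List<String> fqns = new ArrayList<>();
        if (attrFqnsCsv != null) {
            for (String part : attrFqnsCsv.split(",")) {
                String trimmed = part.trim();
                if (!trimmed.isEmpty()) {
                    fqns.add(trimmed);
                }
            }
        }
        if (fqns.isEmpty()) {
            throw new IllegalArgumentException(
                    "Resource attribute FQNs resolved to an empty list: '" + attrFqnsCsv + "'");
        }
        return fqns;
    }
}
```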
---
Duplicate comments:
In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java`:
- Around line 182-185: After evaluating ENTITY_ID into the runtime variable
entityId in ABACEnforcement (the onTrigger/process method where
ctx.getProperty(ENTITY_ID).evaluateAttributeExpressions(...) is called),
validate that entityId is not null/empty/blank; if it is blank, set a helpful
flowfile attribute (e.g., "abac.error" or similar) and transfer the flowfile to
the REL_FAILURE relationship (session.transfer(flowFile, REL_FAILURE)) and
return so the invalid authorization request is not sent; ensure you use the
evaluated entityId value (not the property expression) for this check and
reference ENTITY_ID, entityId, and REL_FAILURE to locate the change.
---
Nitpick comments:
In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java`:
- Around line 62-71: The `@WritesAttributes` on ABACEnforcement is missing the
"abac.error" attribute which is set when the processor triggers fail-open
behavior; update the `@WritesAttributes` annotation (in the ABACEnforcement class)
to include `@WritesAttribute`(attribute = "abac.error", description = "Error
message when authorization lookup fails (present when fail-open is used)") so
the attribute is documented alongside abac.decision, abac.entity_id,
abac.resource_attributes, and abac.processing_time_ms.
- Around line 86-89: Replace the mutable HashSet allocation in
getRelationships() with an immutable Set created via Set.of to avoid the
intermediate list and make the relationships unmodifiable; specifically change
the method getRelationships() to return Set.of(REL_PERMIT, REL_DENY,
REL_FAILURE) (ensure the method references REL_PERMIT, REL_DENY, REL_FAILURE and
that your codebase targets Java 9+).
- Around line 164-166: The call to sdk.getServices().authorization() can return
null and later cause an NPE; inside the method where you call getTDFSDK(ctx) and
assign AuthorizationServiceGrpc.AuthorizationServiceFutureStub authStub
(symbols: getTDFSDK, SDK,
AuthorizationServiceGrpc.AuthorizationServiceFutureStub, authStub), add a
defensive null-check after obtaining authStub (and optionally after sdk or
sdk.getServices()), and if null log a clear error via process logger or throw a
descriptive IllegalStateException indicating the authorization service is not
configured so callers fail fast with an informative message.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 50c0a517-e408-4adf-a58a-b7a16ef9648a
📒 Files selected for processing (8)
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractTDFProcessor.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractToProcessor.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/SimpleOpenTDFControllerService.java
- nifi-tdf-processors/src/main/resources/META-INF/services/org.apache.nifi.processor.Processor
- nifi-tdf-processors/src/test/java/io/opentdf/nifi/Utils.java
- pom.xml
✅ Files skipped from review due to trivial changes (2)
- nifi-tdf-processors/src/test/java/io/opentdf/nifi/Utils.java
- nifi-tdf-processors/src/main/resources/META-INF/services/org.apache.nifi.processor.Processor
🚧 Files skipped from review as they are similar to previous changes (5)
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractTDFProcessor.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/SimpleOpenTDFControllerService.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractToProcessor.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java
- pom.xml
- Upgrade NiFi to 2.7.0 and Java 21; migrate ExpressionLanguageScope from VARIABLE_REGISTRY to ENVIRONMENT across all processors and controller service (NiFi 2.x breaking change)
- Add ABACEnforcement processor: calls DSP GetDecisions to make ABAC permit/deny decisions on any flow file with tdf_attribute set; routes to permit/deny/failure relationships; designed for binary protocol flows (e.g. Link 16/JREAP-C) that cannot be TDF-wrapped
- Add Enable Encryption property to ConvertToZTDF: when false, passes flow files through without TDF wrapping; mirrors the behavior of GATEWAY_ABAC_ENCRYPT_EMAIL=0 in the Virtru Gateway (tag-only mode)

Signed-off-by: Joel C. <143835810+JBCongdon@users.noreply.github.com>
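The tag-only passthrough described in the last bullet can be sketched as follows; wrap() here is only a stand-in for the real ZTDF encryption, which the processor delegates to the OpenTDF SDK:

```java
// Illustrative sketch of the Enable Encryption flag: when false, the payload
// (and, in the real processor, attributes such as mime.type) pass through
// untouched; when true, a stand-in wrap step runs.
class PassthroughMode {

    static byte[] process(byte[] payload, boolean encryptionEnabled) {
        if (!encryptionEnabled) {
            return payload; // tag-only mode: no TDF wrapping
        }
        return wrap(payload);
    }

    // Stand-in for TDF encryption; the real code builds a ZTDF container.
    private static byte[] wrap(byte[] payload) {
        byte[] out = new byte[payload.length + 4];
        out[0] = 'Z'; out[1] = 'T'; out[2] = 'D'; out[3] = 'F';
        System.arraycopy(payload, 0, out, 4, payload.length);
        return out;
    }
}
```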
…assNotFoundException

SSLContextService in NiFi 2.x references TlsConfiguration (org.apache.nifi.security.util) at class-load time. When Mockito/byte-buddy instruments the interface, it must resolve all referenced types — including the return type of getTlsConfiguration() — or fail with ClassNotFoundException. nifi-security-utils-api is the correct Maven Central artifact that contains TlsConfiguration for NiFi 2.x; nifi-security-utils is not published.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Signed-off-by: Joel C. <143835810+JBCongdon@users.noreply.github.com>
…sions
- ABACEnforcement: throw ProcessException when attrFqnsCsv parses to zero
non-blank FQNs (e.g. " , , ") instead of sending an empty-constraint
authorization request that some services may permit by default
- AbstractTDFProcessor: call evaluateAttributeExpressions() before asInteger()
on FLOWFILE_PULL_SIZE so environment-variable references (e.g. ${env:PULL_SIZE})
are resolved correctly at runtime
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Signed-off-by: Joel C. <143835810+JBCongdon@users.noreply.github.com>
b3e9e2f to 106ce8f
Re: CodeRabbit — FLOWFILE_PULL_SIZE expression evaluation Fixed in 106ce8f: changed |
1 similar comment
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java (1)
62-82: ⚠️ Potential issue | 🟡 Minor
Javadoc describes wrong property.
The Javadoc block (lines 62-72) describes the "Sign Assertions" property but is placed directly above the `ENABLE_ENCRYPTION` property definition. This appears to be a copy-paste artifact.
📝 Proposed fix

```diff
 /**
- * Property descriptor for the "Sign Assertions" feature in the ConvertToZTDF processor. This property allows specifying whether
- * the assertions should be signed or not. It is not a required property and defaults to "false".
- * <p>
- * - Name: Sign Assertions
- * - Description: sign assertions
- * - Required: false
- * - Default Value: false
- * - Allowable Values: true, false
- * - Expression Language Supported: {@link ExpressionLanguageScope#ENVIRONMENT}
+ * Property descriptor for the "Enable Encryption" feature in the ConvertToZTDF processor.
+ * When set to false, the flow file passes through without TDF encryption for ABAC policy
+ * enforcement only (tag-only mode).
+ * <p>
+ * - Name: Enable Encryption
+ * - Description: Controls whether TDF encryption is applied
+ * - Required: true
+ * - Default Value: true
+ * - Allowable Values: true, false
+ * - Expression Language Supported: {@link ExpressionLanguageScope#ENVIRONMENT}
  */
 public static final PropertyDescriptor ENABLE_ENCRYPTION = new org.apache.nifi.components.PropertyDescriptor.Builder()
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java` around lines 62 - 82, The Javadoc above the ENABLE_ENCRYPTION PropertyDescriptor is copy-pasted for a different property ("Sign Assertions"); replace or rewrite that Javadoc to accurately document ENABLE_ENCRYPTION (name "Enable Encryption", description matching the .description text, required=true, defaultValue="true", allowableValues "true","false", and ExpressionLanguageScope.ENVIRONMENT) so the comment matches the actual PropertyDescriptor declaration (or simply remove the incorrect Javadoc block if you prefer inline documentation only).
♻️ Duplicate comments (1)
nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java (1)
182-204: ⚠️ Potential issue | 🟠 Major
Missing validation for entityId after expression evaluation.
The `entityId` resolved from expression language (lines 183-184) is used directly in the entity builder without validation. If the expression (e.g., `${jwt.sub}`) evaluates to `null` or blank, the authorization request will contain an entity with no meaningful identifier. This could lead to undefined authorization behavior depending on how the service handles empty identifiers.
The `NON_BLANK_VALIDATOR` on the property only validates that the expression string is non-blank, not the evaluated result.
🛡️ Proposed fix to validate entityId

```diff
 // Resolve entity ID
 String entityId = ctx.getProperty(ENTITY_ID)
         .evaluateAttributeExpressions(flowFile).getValue();
+if (entityId == null || entityId.isBlank()) {
+    throw new ProcessException("Entity ID resolved to null or blank");
+}
 String entityType = ctx.getProperty(ENTITY_TYPE).getValue();
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java` around lines 182 - 204, The code uses the evaluated entityId (from PROPERTY ENTITY_ID via ctx.getProperty(...).evaluateAttributeExpressions(flowFile).getValue()) without validating the evaluated result, so ensure the evaluated entityId is non-null and non-blank before building the Entity: after computing entityId, check for null/blank and throw a ProcessException (or route to failure) with a clear message if invalid; then proceed to use entityId in Entity.Builder (entityBuilder) for setEmailAddress/setUserName/setClientId only when valid. Also include the property name (ENTITY_ID) in the error message so it's easy to trace failing evaluations.
🧹 Nitpick comments (1)
nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java (1)
240-241: Consider restoring the interrupt flag when catching `InterruptedException`.
`Future.get(timeout, TimeUnit)` at lines 240-241 throws `InterruptedException`, which is currently caught by the generic `catch (Exception e)` block at line 280. When a thread is interrupted, best practice is to either re-throw or restore the interrupt status so that higher-level code can detect and handle the interruption.
♻️ Proposed refactor to handle InterruptedException properly

```diff
 } catch (ProcessException pe) {
     // Local validation failures (missing attributes, bad config) are never
     // fail-open — unclassified or malformed flow files must not bypass policy.
     getLogger().error("ABAC request validation failed for FlowFile {}: {}", flowFile.getId(), pe.getMessage());
     session.transfer(flowFile, REL_FAILURE);
+} catch (InterruptedException ie) {
+    Thread.currentThread().interrupt();
+    getLogger().error("ABAC authorization call interrupted for FlowFile {}", flowFile.getId(), ie);
+    session.transfer(flowFile, REL_FAILURE);
 } catch (Exception e) {
     // Remote call failures (network, timeout, service unavailable) respect failOpen.
     getLogger().error("ABAC authorization call failed for FlowFile {}", flowFile.getId(), e);
```

Also applies to: 280-290
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java` around lines 240 - 241, The getDecisions call (GetDecisionsResponse response = authStub.getDecisions(request).get(timeoutSeconds, TimeUnit.SECONDS)) can throw InterruptedException which is currently swallowed by the generic catch in ABACEnforcement; update the exception handling in the method in class ABACEnforcement so you catch InterruptedException separately, restore the interrupt status with Thread.currentThread().interrupt() (or rethrow if preferred), and handle other exceptions (ExecutionException, TimeoutException, IOException, etc.) in their own catch blocks or the existing generic catch to preserve correct control flow and allow higher-level code to observe the interruption.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Outside diff comments:
In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java`:
- Around line 62-82: The Javadoc above the ENABLE_ENCRYPTION PropertyDescriptor
is copy-pasted for a different property ("Sign Assertions"); replace or rewrite
that Javadoc to accurately document ENABLE_ENCRYPTION (name "Enable Encryption",
description matching the .description text, required=true, defaultValue="true",
allowableValues "true","false", and ExpressionLanguageScope.ENVIRONMENT) so the
comment matches the actual PropertyDescriptor declaration (or simply remove the
incorrect Javadoc block if you prefer inline documentation only).
---
Duplicate comments:
In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java`:
- Around line 182-204: The code uses the evaluated entityId (from PROPERTY
ENTITY_ID via
ctx.getProperty(...).evaluateAttributeExpressions(flowFile).getValue()) without
validating the evaluated result, so ensure the evaluated entityId is non-null
and non-blank before building the Entity: after computing entityId, check for
null/blank and throw a ProcessException (or route to failure) with a clear
message if invalid; then proceed to use entityId in Entity.Builder
(entityBuilder) for setEmailAddress/setUserName/setClientId only when valid.
Also include the property name (ENTITY_ID) in the error message so it's easy to
trace failing evaluations.
---
Nitpick comments:
In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java`:
- Around line 240-241: The getDecisions call (GetDecisionsResponse response =
authStub.getDecisions(request).get(timeoutSeconds, TimeUnit.SECONDS)) can throw
InterruptedException which is currently swallowed by the generic catch in
ABACEnforcement; update the exception handling in the method in class
ABACEnforcement so you catch InterruptedException separately, restore the
interrupt status with Thread.currentThread().interrupt() (or rethrow if
preferred), and handle other exceptions (ExecutionException, TimeoutException,
IOException, etc.) in their own catch blocks or the existing generic catch to
preserve correct control flow and allow higher-level code to observe the
interruption.
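The interrupt-handling pattern the comment describes can be shown with a plain `Future` standing in for the gRPC stub's future (an assumption — the real call site uses `authStub.getDecisions(request)`); the wrapper exception types here are illustrative:

```java
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

// Sketch: catch InterruptedException separately and restore the interrupt
// flag so higher-level code can still observe the interruption.
public class InterruptHandling {
    public static String getDecision(Future<String> call, long timeoutSeconds) {
        try {
            return call.get(timeoutSeconds, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // do not swallow the interrupt
            throw new RuntimeException("Interrupted while awaiting decision", e);
        } catch (TimeoutException e) {
            throw new RuntimeException("Decision call timed out", e);
        } catch (ExecutionException e) {
            throw new RuntimeException("Decision call failed", e.getCause());
        }
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        Future<String> ok = pool.submit(() -> "PERMIT");
        System.out.println(getDecision(ok, 1)); // prints PERMIT
        pool.shutdown();
    }
}
```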
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: fa21b028-ea5d-4f14-ba8c-6d2ff7c1ac1a
📒 Files selected for processing (9)
- nifi-tdf-processors/pom.xml
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractTDFProcessor.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractToProcessor.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/SimpleOpenTDFControllerService.java
- nifi-tdf-processors/src/main/resources/META-INF/services/org.apache.nifi.processor.Processor
- nifi-tdf-processors/src/test/java/io/opentdf/nifi/Utils.java
- pom.xml
✅ Files skipped from review due to trivial changes (2)
- nifi-tdf-processors/src/main/resources/META-INF/services/org.apache.nifi.processor.Processor
- pom.xml
🚧 Files skipped from review as they are similar to previous changes (4)
- nifi-tdf-processors/src/test/java/io/opentdf/nifi/Utils.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractTDFProcessor.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractToProcessor.java
- nifi-tdf-processors/src/main/java/io/opentdf/nifi/SimpleOpenTDFControllerService.java
The Javadoc block above ENABLE_ENCRYPTION incorrectly described the Sign Assertions property. Replaced with accurate description of the Enable Encryption flag and its tag-only / ABAC-only mode behavior. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com> Signed-off-by: Joel C. <143835810+JBCongdon@users.noreply.github.com>
Re: CodeRabbit (20h ago) — FLOWFILE_PULL_SIZE ENVIRONMENT expression evaluation: already addressed in 106ce8f alongside the FQN validation fix.
🧹 Nitpick comments (1)
nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java (1)
217-219: Consider adding a flow file attribute to indicate encryption status. When `ENABLE_ENCRYPTION` is `false`, the flow file is still wrapped by `createTDF()` (line 232) and assigned `mime.type=application/ztdf+zip` (line 242). Downstream processors have no way to distinguish encrypted TDFs from unencrypted ones. Adding an attribute would improve observability and enable conditional downstream routing:
Suggested enhancement

```diff
  if (!encryptionEnabled) {
      configurationOptions.add(cfg -> cfg.enableEncryption = false);
  }
  // ...
  updatedFlowFile = processSession.putAttribute(updatedFlowFile, "mime.type", "application/ztdf+zip");
+ updatedFlowFile = processSession.putAttribute(updatedFlowFile, "tdf.encryption.enabled", String.valueOf(encryptionEnabled));
  processSession.transfer(updatedFlowFile, REL_SUCCESS);
```

Also applies to: 242-242
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java` around lines 217 - 219, When encryptionEnabled is false the processor still wraps the FlowFile via createTDF() and sets mime.type=application/ztdf+zip, so add a FlowFile attribute to indicate encryption status; update the code path around configurationOptions.add(...) and immediately after createTDF()/mime.type assignment to call session.putAttribute on the FlowFile (e.g., "ztdf.encrypted"="false" when encryptionEnabled is false, and set "true" when encryptionEnabled is true) so downstream processors can inspect the attribute to distinguish encrypted vs unencrypted TDFs.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Nitpick comments:
In `@nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java`:
- Around line 217-219: When encryptionEnabled is false the processor still wraps
the FlowFile via createTDF() and sets mime.type=application/ztdf+zip, so add a
FlowFile attribute to indicate encryption status; update the code path around
configurationOptions.add(...) and immediately after createTDF()/mime.type
assignment to call session.putAttribute on the FlowFile (e.g.,
"ztdf.encrypted"="false" when encryptionEnabled is false, and set "true" when
encryptionEnabled is true) so downstream processors can inspect the attribute to
distinguish encrypted vs unencrypted TDFs.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 0dbf7ab0-4a35-4eb8-a84c-09df841c74d2
📒 Files selected for processing (1)
nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java
Adds a new NiFi processor that parses JREAP-C (Joint Range Extension Applications Protocol Category C) binary message headers and exposes policy-relevant fields as flow file attributes for downstream ABAC enforcement. Extracted fields: - jreapc.classification / jreapc.classification_code (0=UNCLASSIFIED, 1=CUI, 2=SECRET, 3=TOP SECRET) - jreapc.word_type / jreapc.word_type_code (J3.0, J5.0, J7.0, etc.) - jreapc.exercise, jreapc.simulation (flag bits) - jreapc.sequence_number, jreapc.track_number, jreapc.timestamp - jreapc.source_address, jreapc.destination_address (hex) - jreapc.payload_size Optional 'Classification Attribute Namespace' property: when set, populates tdf_attribute with the DSP attribute FQN so ABACEnforcement can consume it directly without additional transformation. Content is passed through unmodified; only attributes are written. Intended flow: [JREAP-C source] → [ParseJREAPC] → [ABACEnforcement] → permit/deny Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com> Signed-off-by: Joel C. <143835810+JBCongdon@users.noreply.github.com>
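The classification-code table listed above (0=UNCLASSIFIED, 1=CUI, 2=SECRET, 3=TOP SECRET) can be sketched as a small mapping; the class and method names are illustrative, not ParseJREAPC's actual internals:

```java
// Sketch: derive the jreapc.classification label from jreapc.classification_code,
// per the code table in the commit message above.
public class JreapcClassification {
    public static String label(int code) {
        switch (code) {
            case 0: return "UNCLASSIFIED";
            case 1: return "CUI";
            case 2: return "SECRET";
            case 3: return "TOP SECRET";
            default: throw new IllegalArgumentException("Unknown classification code: " + code);
        }
    }

    public static void main(String[] args) {
        System.out.println(label(2)); // prints SECRET
    }
}
```

In the described flow, this label would land in the `jreapc.classification` attribute while the raw code stays in `jreapc.classification_code`.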
MockFlowFile.assertContentEquals(byte[]) throws a checked IOException; the test method must declare it to compile under javac -Xlint:unchecked. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com> Signed-off-by: Joel C. <143835810+JBCongdon@users.noreply.github.com>
Documents all four processors (ConvertToZTDF, ConvertFromZTDF, ParseJREAPC, ABACEnforcement), the binary protocol ABAC flow pattern, and tag-only mode. Adds a table of contents, flow diagrams in text, and configuration steps. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com> Signed-off-by: Joel C. <143835810+JBCongdon@users.noreply.github.com>
ttschampel
left a comment
I think it makes sense to house the JREAPC processor in a separate module (still hosted in this repo, just new Maven modules) and keep nifi-tdf-processors domain-independent.
- Move the JREAPC processor to a separate processor/NAR module.
- Generalize the new ABACEnforcement to remove references to JREAPC.
Separates protocol-specific parsing from the domain-independent TDF processor module. nifi-tdf-processors now contains only OpenTDF encryption and generic ABAC enforcement; JREAP-C parsing lives in its own module and NAR that can be deployed independently. - nifi-jreapc-processors: new module containing ParseJREAPC and its tests; depends only on nifi-api and nifi-utils (no OpenTDF SDK) - nifi-jreapc-nar: new NAR packaging for the JREAP-C processor - nifi-tdf-processors: ParseJREAPC and ParseJREAPCTest removed - ABACEnforcement: removed all JREAP-C-specific references from annotations, tags, and Javadoc — the processor is now described as a generic ABAC enforcement component that works with any upstream source that populates tdf_attribute - Root pom.xml: registers both new modules in <modules> Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com> Signed-off-by: Joel C. <143835810+JBCongdon@users.noreply.github.com>
… tests ABACEnforcementTest (11 tests) covers permit, deny, fail-open, fail-closed, missing/blank/empty tdf_attribute validation, default resource attributes, empty decision list, and multi-decision deny-wins logic. JREAPCPipelineTest (3 tests) chains ParseJREAPC → ABACEnforcement → ConvertToZTDF with synthetic JREAP-C binary data, asserting that the original bytes are preserved byte-for-byte through the full pipeline. Adds nifi-jreapc-processors as a test-scope dependency to enable end-to-end instantiation of ParseJREAPC in the TDF processor test suite. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com> Signed-off-by: Joel C. <143835810+JBCongdon@users.noreply.github.com>
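The multi-decision deny-wins logic those tests cover can be sketched in a few lines. This is an assumption-laden stand-in, not the processor's code: the enum is a placeholder for the platform's decision type, and treating an empty decision list as DENY mirrors the fail-closed behavior the test names suggest:

```java
import java.util.List;

// Sketch: any non-PERMIT decision (or no decisions at all) yields overall DENY.
public class DenyWins {
    public enum Decision { PERMIT, DENY }

    public static Decision overall(List<Decision> decisions) {
        if (decisions.isEmpty()) return Decision.DENY; // assumed: no decision -> deny
        return decisions.stream().allMatch(d -> d == Decision.PERMIT)
                ? Decision.PERMIT : Decision.DENY;
    }

    public static void main(String[] args) {
        System.out.println(overall(List.of(Decision.PERMIT, Decision.DENY))); // prints DENY
    }
}
```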
ParseJREAPC moved to nifi-jreapc-processors in the prior refactor commit; update the README link to match the new location. Add a Testing section documenting the test classes, their scope, and the mvn verify command needed to run them. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com> Signed-off-by: Joel C. <143835810+JBCongdon@users.noreply.github.com>
ttschampel
left a comment
this looks good to me, can you update the PR to sign your commits?
Summary
This PR upgrades the NiFi integration to NiFi 2.x and adds two new processors for ABAC enforcement on binary protocol streams, plus a dedicated Maven module for JREAP-C support.
Changes
NiFi 2.x upgrade
New: `ABACEnforcement` processor (`nifi-tdf-processors` / `nifi-tdf-nar`)
New: `ParseJREAPC` processor (`nifi-jreapc-processors` / `nifi-jreapc-nar` — separate NAR)
Example pipeline
This pattern enforces attribute-based access control on binary protocol streams that cannot be TDF-wrapped, without modifying the payload bytes.
`ConvertToZTDF` passthrough mode
Test coverage
Files changed