diff --git a/README.md b/README.md
index 4662041..1beeb84 100644
--- a/README.md
+++ b/README.md
@@ -1,33 +1,92 @@
 # OpenTDF NiFi
-Integration of the [OpenTDF Platform](https://github.com/opentdf/platform) into [NiFi](https://nifi.apache.org/)
-Components:
-* "Zero Trust Data Format" (ZTDF) Processors:
-  * [ConvertToZTDF](./nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java): A NiFi processor that converts FlowFile content to ZTDF format.
-  * [ConvertFromZTDF](./nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertFromZTDF.java): A NiFi processor that converts ZTDF formatted FlowFile content to its plaintext representation
+Integration of the [OpenTDF Platform](https://github.com/opentdf/platform) into [Apache NiFi](https://nifi.apache.org/). Provides processors for TDF encryption/decryption and attribute-based access control (ABAC) enforcement on any flow file — including binary protocol streams that cannot be TDF-wrapped.
-* Controller Services:
-  * [OpenTDFControllerService](./nifi-tdf-controller-services-api/src/main/java/io/opentdf/nifi/OpenTDFControllerService.java): A NiFi controller service providing OpenTDF Platform Configuration
+## Processors
+
+### TDF Encryption
+
+| Processor | Description |
+|-----------|-------------|
+| [ConvertToZTDF](./nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java) | Encrypts flow file content as a Zero Trust Data Format (ZTDF) object, binding OpenTDF policy attributes to the ciphertext. Set **Enable Encryption = false** to tag a flow file with policy attributes without encrypting the payload (ABAC-only / tag-only mode). |
+| [ConvertFromZTDF](./nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertFromZTDF.java) | Decrypts a ZTDF-formatted flow file back to plaintext using the configured KAS endpoint. |
+
+### Binary Protocol Support
+
+| Processor | Description |
+|-----------|-------------|
+| [ParseJREAPC](./nifi-jreapc-processors/src/main/java/io/opentdf/nifi/ParseJREAPC.java) | Parses JREAP-C (Joint Range Extension Applications Protocol Category C) binary message headers and extracts policy-relevant fields — classification, J-series word type, track number, source/destination addressing, and timestamp — as flow file attributes. Optionally populates `tdf_attribute` automatically from the classification level when a **Classification Attribute Namespace** is configured, making it directly consumable by `ABACEnforcement` downstream. Payload bytes are passed through unmodified. |
+| [ABACEnforcement](./nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java) | Calls the OpenTDF Authorization Service `GetDecisions` endpoint to make an ABAC permit/deny decision for the flow file. Uses the `tdf_attribute` flow file attribute as the resource context. Routes to **permit**, **deny**, or **failure** relationships. Supports a **Fail Open** property to control behavior when the authorization service is unreachable. Designed to enforce policy on binary protocol streams (JREAP-C, sensor feeds, telemetry) that cannot be encrypted as TDF but still require access control. |
+
+### Controller Services
+
+| Service | Description |
+|---------|-------------|
+| [OpenTDFControllerService](./nifi-tdf-controller-services-api/src/main/java/io/opentdf/nifi/OpenTDFControllerService.java) | Shared controller service providing OpenTDF Platform configuration (endpoint, OIDC credentials, KAS URL) to all TDF processors. |
+
+## What This Enables
+
+**Standard TDF flow** — encrypt data in NiFi and enforce policy downstream:
+
+```
+[Source] → [ConvertToZTDF] → [storage / transit] → [ConvertFromZTDF] → [Consumer]
+```
+
+**ABAC-only flow for binary protocols** — enforce policy without modifying payload bytes:
+
+```
+[JREAP-C source] → [ParseJREAPC] → [ABACEnforcement] → permit → [forward]
+                                                       → deny → [drop / audit]
+                                                       → failure → [error handling]
+```
+
+This pattern is the NiFi equivalent of the `GATEWAY_ABAC_ENCRYPT_EMAIL=0` mode in gateway deployments: attribute tagging and policy enforcement happen without wrapping the content as TDF. It applies wherever the payload format is fixed (Link 16/JREAP-C, sensor feeds, telemetry streams) and the flow requires access control without encryption.
+
+**Tag-only mode with `ConvertToZTDF`** — enforce policy and optionally encrypt in a single processor:
+
+```
+[Source] → [UpdateAttribute tdf_attribute=...] → [ConvertToZTDF Enable Encryption=false] → [ABACEnforcement]
+```
+
+## Using a Custom TrustStore
-## Using a custom TrustStore
 Communicating over TLS with self-signed or other untrusted certs can be configured using NiFi's standard [SSL Context Service](https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-ssl-context-service-nar/1.25.0/org.apache.nifi.ssl.StandardSSLContextService/index.html)
-and then wired into the processors by setting their respective SSL Context Service properties to use a configured
-SSL Context Service.
+and wired into processors via their **SSL Context Service** property.
-## Example
+## Configuration
 To use these processors in NiFi:
-* Configure the OpenTDFControllerService properties
-  * set then OpenTDF compliant endpoint
-  * set OIDC Client credentials (client id and client secret)
-  * set the data policy (UpdateAttribute Processor)
-  * set the KAS URL: ConvertToZTDF processor
+1. Configure the **OpenTDFControllerService**:
+   - OpenTDF platform endpoint
+   - OIDC client credentials (client ID and client secret)
+2. Wire the controller service into each processor's **OpenTDF Config Service** property
+3. Set `tdf_attribute` on flow files (via `UpdateAttribute`, `ParseJREAPC`, or other means) with one or more OpenTDF attribute value FQNs in the format `https://namespace/attr/name/value/val`
-#### FlowChart: Generic ZTDF Nifi Flows
+#### FlowChart: Generic ZTDF NiFi Flows
 ![diagram](./docs/diagrams/generic_ztdf_nifi_flows.svg)
+## Testing
+
+Unit and integration tests are included for all processors. Run the full suite with:
+
+```shell
+export GITHUB_ACTOR=your-gh-username
+export GITHUB_TOKEN=your-gh-token
+mvn --batch-mode clean verify -s settings.xml
+```
+
+Test coverage includes:
+
+| Test class | What it covers |
+|------------|----------------|
+| `ParseJREAPCTest` | 32-byte header parsing: all classification codes, word types, exercise/simulation flags, `tdf_attribute` slug generation, content pass-through |
+| `ABACEnforcementTest` | Permit, deny, fail-open/closed, missing/blank/empty `tdf_attribute`, default resource attributes, empty decision list, multi-decision deny-wins |
+| `JREAPCPipelineTest` | Full pipeline chain: `ParseJREAPC` → `ABACEnforcement` → `ConvertToZTDF` with synthetic JREAP-C binary data, asserting byte-for-byte preservation through all stages |
+| `ConvertToZTDFTest` | TDF wrapping, attribute binding, assertion signing |
+| `ConvertFromZTDFTest` | TDF decryption routing |
+
 # Quick Start - Docker Compose
 1. Build the NiFi Archives (NARs) and place in the docker compose mounted volumes.
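A minimal build sketch for step 1, assuming the same GitHub Packages credentials shown in the Testing section; the `./nifi/extensions/` destination is a placeholder for whatever directory your compose file mounts as the NiFi extensions volume:

```shell
# Sketch only: the credentials and the destination path below are assumptions, not repo defaults.
export GITHUB_ACTOR=your-gh-username
export GITHUB_TOKEN=your-gh-token

# Build all modules (the *-nar modules produce the NAR archives) without re-running the test suite.
mvn --batch-mode clean package -s settings.xml -DskipTests

# Copy every built NAR into the docker compose mounted volume (illustrative path).
find . -path '*/target/*.nar' -exec cp {} ./nifi/extensions/ \;
```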
The opentd diff --git a/nifi-jreapc-nar/pom.xml b/nifi-jreapc-nar/pom.xml new file mode 100644 index 0000000..bb2fb06 --- /dev/null +++ b/nifi-jreapc-nar/pom.xml @@ -0,0 +1,110 @@ + + + 4.0.0 + + + io.opentdf.nifi + nifi-pom + 0.10.0 + + nifi-jreapc-nar + nifi-jreapc-nar + NiFi JREAP-C Processor NAR Archive + + true + + nar + + + + ${project.groupId} + nifi-jreapc-processors + 0.10.0 + + + + + + + org.apache.nifi + nifi-nar-maven-plugin + + + org.apache.maven.plugins + maven-source-plugin + 3.3.1 + + + attach-sources + + jar + + + + + + org.apache.maven.plugins + maven-javadoc-plugin + 3.8.0 + + + attach-javadocs + + jar + + + + + + net.nicoulaj.maven.plugins + checksum-maven-plugin + 1.11 + + + create-checksums + package + + files + + + + MD5 + SHA-1 + SHA-256 + SHA-512 + + true + + + ${project.build.directory} + + *.nar + + + + + + + + + org.jacoco + jacoco-maven-plugin + + + default-prepare-agent + none + + + report-aggregate + none + + + default-report-aggregate + none + + + + + + diff --git a/nifi-jreapc-processors/pom.xml b/nifi-jreapc-processors/pom.xml new file mode 100644 index 0000000..7533c1d --- /dev/null +++ b/nifi-jreapc-processors/pom.xml @@ -0,0 +1,126 @@ + + + 4.0.0 + + + io.opentdf.nifi + nifi-pom + 0.10.0 + + nifi-jreapc-processors + nifi-jreapc-processors + NiFi processor for parsing JREAP-C binary protocol messages + jar + + + + org.apache.nifi + nifi-api + + + org.apache.nifi + nifi-utils + + + org.junit.jupiter + junit-jupiter-engine + test + + + org.apache.nifi + nifi-mock + test + + + org.slf4j + slf4j-api + + + com.fasterxml.jackson.core + jackson-databind + + + org.apache.commons + commons-text + + + org.apache.httpcomponents + httpcore + + + + + org.apache.logging.log4j + log4j-slf4j-impl + 2.17.2 + test + + + + + + + org.apache.maven.plugins + maven-source-plugin + 3.3.1 + + + attach-sources + + jar + + + + + + org.apache.maven.plugins + maven-javadoc-plugin + 3.8.0 + + UTF-8 + ${maven.compiler.source} + + + + attach-javadocs + + jar + + + + + + net.nicoulaj.maven.plugins + checksum-maven-plugin + 1.11 + + + create-checksums + package + + files + + + + MD5 + SHA-1 + SHA-256 + SHA-512 + + true + + + ${project.build.directory} + + *.jar + + + + + + + + + + diff --git a/nifi-jreapc-processors/src/main/java/io/opentdf/nifi/ParseJREAPC.java b/nifi-jreapc-processors/src/main/java/io/opentdf/nifi/ParseJREAPC.java new file mode 100644 index 0000000..a21e39d --- /dev/null +++ b/nifi-jreapc-processors/src/main/java/io/opentdf/nifi/ParseJREAPC.java @@ -0,0 +1,224 @@ +package io.opentdf.nifi; + +import org.apache.nifi.annotation.behavior.ReadsAttribute; +import org.apache.nifi.annotation.behavior.WritesAttribute; +import org.apache.nifi.annotation.behavior.WritesAttributes; +import org.apache.nifi.annotation.documentation.CapabilityDescription; +import org.apache.nifi.annotation.documentation.Tags; +import org.apache.nifi.components.PropertyDescriptor; +import org.apache.nifi.expression.ExpressionLanguageScope; +import org.apache.nifi.flowfile.FlowFile; +import org.apache.nifi.processor.AbstractProcessor; +import org.apache.nifi.processor.ProcessContext; +import org.apache.nifi.processor.ProcessSession; +import org.apache.nifi.processor.Relationship; +import org.apache.nifi.processor.exception.ProcessException; +import org.apache.nifi.processor.util.StandardValidators; + +import java.nio.ByteBuffer; +import java.nio.ByteOrder; +import java.time.Instant; +import java.time.ZoneOffset; +import java.time.format.DateTimeFormatter; +import java.util.Arrays; +import 
java.util.Collections; +import java.util.HashMap; +import java.util.HashSet; +import java.util.List; +import java.util.Map; +import java.util.Set; + +/** + * Parses binary JREAP-C (Joint Range Extension Applications Protocol Category C) + * messages and extracts policy-relevant header fields as NiFi flow file attributes. + * + * JREAP-C Header Format (32 bytes): + * [0-1] uint16 BE J-series word type + * [2] uint8 Security classification (0=U, 1=CUI, 2=S, 3=TS) + * [3] uint8 Flags (bit0=exercise, bit1=simulation) + * [4-7] uint32 BE Sequence number + * [8-15] 8 bytes Source address + * [16-23] 8 bytes Destination address + * [24-27] uint32 BE Timestamp (Unix seconds) + * [28-29] uint16 BE Track number + * [30-31] 2 bytes Reserved + * [32+] Payload + * + * The flow file content is passed through unmodified. Only attributes are added. + */ +@CapabilityDescription("Parses JREAP-C binary message headers and extracts classification, " + + "source/destination, track number, and word type as flow file attributes for downstream " + + "ABAC policy enforcement. The payload bytes are passed through unchanged.") +@Tags({"JREAP-C", "JREAP", "Link16", "TDL", "tactical", "ABAC", "classification", "parse", "DSP"}) +@WritesAttributes({ + @WritesAttribute(attribute = "jreapc.word_type", description = "J-series word type label (e.g. J3.0)"), + @WritesAttribute(attribute = "jreapc.word_type_code", description = "J-series word type hex code"), + @WritesAttribute(attribute = "jreapc.classification", description = "Security classification label (UNCLASSIFIED, CUI, SECRET, TOP SECRET)"), + @WritesAttribute(attribute = "jreapc.classification_code", description = "Raw classification byte value (0-3)"), + @WritesAttribute(attribute = "jreapc.exercise", description = "true if exercise flag is set"), + @WritesAttribute(attribute = "jreapc.simulation", description = "true if simulation flag is set"), + @WritesAttribute(attribute = "jreapc.sequence_number", description = "Message sequence number"), + @WritesAttribute(attribute = "jreapc.source_address", description = "Source node address (hex)"), + @WritesAttribute(attribute = "jreapc.destination_address", description = "Destination node address (hex)"), + @WritesAttribute(attribute = "jreapc.timestamp", description = "Message timestamp (ISO-8601 UTC)"), + @WritesAttribute(attribute = "jreapc.track_number", description = "Track number"), + @WritesAttribute(attribute = "jreapc.payload_size", description = "Payload size in bytes (after 32-byte header)"), +}) +public class ParseJREAPC extends AbstractProcessor { + + static final int HEADER_SIZE = 32; + + /** Map of J-series word type codes to human-readable labels. 
*/ + private static final Map WORD_TYPE_LABELS; + static { + Map m = new HashMap<>(); + m.put(0x0300, "J3.0"); // Track Data + m.put(0x0304, "J3.4"); // EW Track Data + m.put(0x0500, "J5.0"); // Air Track + m.put(0x0504, "J5.4"); // Air Posture + m.put(0x0700, "J7.0"); // Surface Track + m.put(0x0900, "J9.0"); // Point of Interest + m.put(0x0C00, "J12.0"); // Mission Assignment + m.put(0x0D00, "J13.0"); // C2 Message + m.put(0x1100, "J17.0"); // Joint Engagement Sequence + m.put(0x1200, "J18.0"); // Correlation + m.put(0x1F00, "J31.0"); // Identification + WORD_TYPE_LABELS = Collections.unmodifiableMap(m); + } + + private static final String[] CLASSIFICATION_LABELS = { + "UNCLASSIFIED", "CUI", "SECRET", "TOP SECRET" + }; + + // ─── Relationships ──────────────────────────────────────────────────────── + + static final Relationship REL_SUCCESS = new Relationship.Builder() + .name("success") + .description("Flow file with parsed JREAP-C attributes") + .build(); + + static final Relationship REL_FAILURE = new Relationship.Builder() + .name("failure") + .description("Flow file that could not be parsed (too short, invalid header)") + .build(); + + // ─── Properties ────────────────────────────────────────────────────────── + + static final PropertyDescriptor CLASSIFICATION_ATTRIBUTE_NAMESPACE = new PropertyDescriptor.Builder() + .name("Classification Attribute Namespace") + .displayName("Classification Attribute Namespace") + .description("DSP attribute namespace FQN for classification mapping. " + + "When set, a 'tdf_attribute' flow file attribute is populated with the " + + "corresponding attribute value FQN so downstream ABACEnforcement can use it directly. " + + "Example: https://classification.example.org/attr/level") + .required(false) + .addValidator(StandardValidators.NON_BLANK_VALIDATOR) + .expressionLanguageSupported(ExpressionLanguageScope.ENVIRONMENT) + .build(); + + static final PropertyDescriptor FLOWFILE_PULL_SIZE = new PropertyDescriptor.Builder() + .name("FlowFile queue pull limit") + .description("FlowFile queue pull size limit") + .required(true) + .expressionLanguageSupported(ExpressionLanguageScope.ENVIRONMENT) + .defaultValue("10") + .addValidator(StandardValidators.INTEGER_VALIDATOR) + .build(); + + @Override + public Set getRelationships() { + return new HashSet<>(Arrays.asList(REL_SUCCESS, REL_FAILURE)); + } + + @Override + public List getSupportedPropertyDescriptors() { + return Arrays.asList(CLASSIFICATION_ATTRIBUTE_NAMESPACE, FLOWFILE_PULL_SIZE); + } + + @Override + public void onTrigger(ProcessContext ctx, ProcessSession session) throws ProcessException { + List flowFiles = session.get(ctx.getProperty(FLOWFILE_PULL_SIZE).evaluateAttributeExpressions().asInteger()); + if (flowFiles.isEmpty()) return; + + String classificationNs = ctx.getProperty(CLASSIFICATION_ATTRIBUTE_NAMESPACE).isSet() + ? 
ctx.getProperty(CLASSIFICATION_ATTRIBUTE_NAMESPACE).evaluateAttributeExpressions().getValue() + : null; + + for (FlowFile flowFile : flowFiles) { + if (flowFile.getSize() < HEADER_SIZE) { + getLogger().warn("FlowFile {} is too small ({} bytes) to be a valid JREAP-C message", + flowFile.getId(), flowFile.getSize()); + session.transfer(flowFile, REL_FAILURE); + continue; + } + + final byte[] header = new byte[HEADER_SIZE]; + session.read(flowFile, in -> in.read(header, 0, HEADER_SIZE)); + + try { + Map attrs = parseHeader(header, (int) flowFile.getSize() - HEADER_SIZE, + classificationNs); + flowFile = session.putAllAttributes(flowFile, attrs); + session.transfer(flowFile, REL_SUCCESS); + } catch (Exception e) { + getLogger().error("Failed to parse JREAP-C header for FlowFile {}", flowFile.getId(), e); + session.transfer(flowFile, REL_FAILURE); + } + } + } + + // ─── Package-private for testing ───────────────────────────────────────── + + Map parseHeader(byte[] header, int payloadSize, String classificationNs) { + ByteBuffer buf = ByteBuffer.wrap(header).order(ByteOrder.BIG_ENDIAN); + + int wordTypeCode = buf.getShort(0) & 0xFFFF; + int classCode = buf.get(2) & 0xFF; + int flags = buf.get(3) & 0xFF; + long seqNumber = buf.getInt(4) & 0xFFFFFFFFL; + byte[] srcAddr = Arrays.copyOfRange(header, 8, 16); + byte[] dstAddr = Arrays.copyOfRange(header, 16, 24); + long timestamp = buf.getInt(24) & 0xFFFFFFFFL; + int trackNumber = buf.getShort(28) & 0xFFFF; + + boolean exercise = (flags & 0x01) != 0; + boolean simulation = (flags & 0x02) != 0; + + String wordTypeLabel = WORD_TYPE_LABELS.getOrDefault(wordTypeCode, + String.format("J-UNKNOWN(0x%04X)", wordTypeCode)); + String classLabel = classCode < CLASSIFICATION_LABELS.length + ? CLASSIFICATION_LABELS[classCode] + : "UNKNOWN(" + classCode + ")"; + String isoTimestamp = Instant.ofEpochSecond(timestamp) + .atOffset(ZoneOffset.UTC) + .format(DateTimeFormatter.ISO_OFFSET_DATE_TIME); + + Map attrs = new HashMap<>(); + attrs.put("jreapc.word_type", wordTypeLabel); + attrs.put("jreapc.word_type_code", String.format("0x%04X", wordTypeCode)); + attrs.put("jreapc.classification", classLabel); + attrs.put("jreapc.classification_code", String.valueOf(classCode)); + attrs.put("jreapc.exercise", String.valueOf(exercise)); + attrs.put("jreapc.simulation", String.valueOf(simulation)); + attrs.put("jreapc.sequence_number", String.valueOf(seqNumber)); + attrs.put("jreapc.source_address", bytesToHex(srcAddr)); + attrs.put("jreapc.destination_address", bytesToHex(dstAddr)); + attrs.put("jreapc.timestamp", isoTimestamp); + attrs.put("jreapc.track_number", String.valueOf(trackNumber)); + attrs.put("jreapc.payload_size", String.valueOf(payloadSize)); + + // Auto-populate tdf_attribute from classification if namespace configured + if (classificationNs != null && !classificationNs.isBlank()) { + String valueSuffix = classLabel.toLowerCase().replace(" ", "_"); + attrs.put("tdf_attribute", classificationNs + "/value/" + valueSuffix); + } + + return attrs; + } + + private static String bytesToHex(byte[] bytes) { + StringBuilder sb = new StringBuilder("0x"); + for (byte b : bytes) sb.append(String.format("%02X", b)); + return sb.toString(); + } +} diff --git a/nifi-jreapc-processors/src/main/resources/META-INF/services/org.apache.nifi.processor.Processor b/nifi-jreapc-processors/src/main/resources/META-INF/services/org.apache.nifi.processor.Processor new file mode 100644 index 0000000..0763744 --- /dev/null +++ 
b/nifi-jreapc-processors/src/main/resources/META-INF/services/org.apache.nifi.processor.Processor @@ -0,0 +1 @@ +io.opentdf.nifi.ParseJREAPC diff --git a/nifi-jreapc-processors/src/test/java/io/opentdf/nifi/ParseJREAPCTest.java b/nifi-jreapc-processors/src/test/java/io/opentdf/nifi/ParseJREAPCTest.java new file mode 100644 index 0000000..1175633 --- /dev/null +++ b/nifi-jreapc-processors/src/test/java/io/opentdf/nifi/ParseJREAPCTest.java @@ -0,0 +1,177 @@ +package io.opentdf.nifi; + +import org.apache.nifi.util.MockFlowFile; +import org.apache.nifi.util.TestRunner; +import org.apache.nifi.util.TestRunners; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; + +import java.nio.ByteBuffer; +import java.nio.ByteOrder; +import java.util.List; +import java.util.Map; + +import static org.junit.jupiter.api.Assertions.*; + +class ParseJREAPCTest { + + private ParseJREAPC processor; + private TestRunner runner; + + @BeforeEach + void setup() { + processor = new ParseJREAPC(); + runner = TestRunners.newTestRunner(processor); + } + + // ─── parseHeader unit tests ─────────────────────────────────────────────── + + @Test + void parseHeader_secretClassification() { + byte[] header = buildHeader(0x0500, 2, 0x00, 42, new byte[8], new byte[8], 1700000000L, 99); + Map attrs = processor.parseHeader(header, 100, null); + + assertEquals("SECRET", attrs.get("jreapc.classification")); + assertEquals("2", attrs.get("jreapc.classification_code")); + assertEquals("J5.0", attrs.get("jreapc.word_type")); + assertEquals("0x0500", attrs.get("jreapc.word_type_code")); + assertEquals("42", attrs.get("jreapc.sequence_number")); + assertEquals("99", attrs.get("jreapc.track_number")); + assertEquals("100", attrs.get("jreapc.payload_size")); + assertFalse(Boolean.parseBoolean(attrs.get("jreapc.exercise"))); + assertFalse(Boolean.parseBoolean(attrs.get("jreapc.simulation"))); + assertNull(attrs.get("tdf_attribute"), "tdf_attribute not set when namespace is null"); + } + + @Test + void parseHeader_allClassificationLabels() { + String[] expected = {"UNCLASSIFIED", "CUI", "SECRET", "TOP SECRET"}; + for (int code = 0; code < 4; code++) { + byte[] header = buildHeader(0x0300, code, 0x00, 0, new byte[8], new byte[8], 0L, 0); + Map attrs = processor.parseHeader(header, 0, null); + assertEquals(expected[code], attrs.get("jreapc.classification"), "code=" + code); + } + } + + @Test + void parseHeader_unknownClassificationCode() { + byte[] header = buildHeader(0x0300, 9, 0x00, 0, new byte[8], new byte[8], 0L, 0); + Map attrs = processor.parseHeader(header, 0, null); + assertTrue(attrs.get("jreapc.classification").startsWith("UNKNOWN")); + } + + @Test + void parseHeader_exerciseAndSimulationFlags() { + byte[] header = buildHeader(0x0300, 0, 0x03, 0, new byte[8], new byte[8], 0L, 0); + Map attrs = processor.parseHeader(header, 0, null); + assertTrue(Boolean.parseBoolean(attrs.get("jreapc.exercise"))); + assertTrue(Boolean.parseBoolean(attrs.get("jreapc.simulation"))); + } + + @Test + void parseHeader_tdfAttributePopulatedWhenNamespaceSet() { + byte[] header = buildHeader(0x0500, 2, 0x00, 0, new byte[8], new byte[8], 0L, 0); + String ns = "https://classification.example.org/attr/level"; + Map attrs = processor.parseHeader(header, 0, ns); + assertEquals(ns + "/value/secret", attrs.get("tdf_attribute")); + } + + @Test + void parseHeader_topSecretTdfAttributeSlug() { + byte[] header = buildHeader(0x0300, 3, 0x00, 0, new byte[8], new byte[8], 0L, 0); + String ns = "https://ns.example/attr/level"; + Map attrs = 
processor.parseHeader(header, 0, ns); + assertEquals(ns + "/value/top_secret", attrs.get("tdf_attribute")); + } + + @Test + void parseHeader_unknownWordType() { + byte[] header = buildHeader(0xABCD, 0, 0x00, 0, new byte[8], new byte[8], 0L, 0); + Map attrs = processor.parseHeader(header, 0, null); + assertTrue(attrs.get("jreapc.word_type").startsWith("J-UNKNOWN")); + assertEquals("0xABCD", attrs.get("jreapc.word_type_code")); + } + + // ─── Processor integration tests ───────────────────────────────────────── + + @Test + void onTrigger_tooShortGoesToFailure() { + runner.enqueue(new byte[10]); // smaller than 32-byte header + runner.run(1); + + assertEquals(0, runner.getFlowFilesForRelationship(ParseJREAPC.REL_SUCCESS).size()); + assertEquals(1, runner.getFlowFilesForRelationship(ParseJREAPC.REL_FAILURE).size()); + } + + @Test + void onTrigger_validMessageGoesToSuccess() throws Exception { + byte[] payload = "tactical payload".getBytes(); + byte[] message = buildMessage(0x0700, 1, 0x00, 7, new byte[8], new byte[8], 1700000000L, 5, payload); + + runner.enqueue(message); + runner.run(1); + + List success = runner.getFlowFilesForRelationship(ParseJREAPC.REL_SUCCESS); + assertEquals(1, success.size()); + MockFlowFile ff = success.get(0); + ff.assertAttributeEquals("jreapc.classification", "CUI"); + ff.assertAttributeEquals("jreapc.word_type", "J7.0"); + ff.assertAttributeEquals("jreapc.payload_size", String.valueOf(payload.length)); + // Content is passed through unchanged + ff.assertContentEquals(message); + } + + @Test + void onTrigger_exactlyHeaderSize() { + byte[] message = buildMessage(0x0300, 0, 0x00, 1, new byte[8], new byte[8], 0L, 0, new byte[0]); + runner.enqueue(message); + runner.run(1); + + assertEquals(1, runner.getFlowFilesForRelationship(ParseJREAPC.REL_SUCCESS).size()); + runner.getFlowFilesForRelationship(ParseJREAPC.REL_SUCCESS).get(0) + .assertAttributeEquals("jreapc.payload_size", "0"); + } + + @Test + void onTrigger_withNamespaceProperty() { + runner.setProperty(ParseJREAPC.CLASSIFICATION_ATTRIBUTE_NAMESPACE, + "https://classification.example.org/attr/level"); + byte[] message = buildMessage(0x0500, 2, 0x00, 1, new byte[8], new byte[8], 0L, 0, new byte[0]); + runner.enqueue(message); + runner.run(1); + + MockFlowFile ff = runner.getFlowFilesForRelationship(ParseJREAPC.REL_SUCCESS).get(0); + ff.assertAttributeEquals("tdf_attribute", + "https://classification.example.org/attr/level/value/secret"); + } + + // ─── Helpers ───────────────────────────────────────────────────────────── + + /** Build a 32-byte JREAP-C header. */ + static byte[] buildHeader(int wordType, int classCode, int flags, + long seqNum, byte[] src, byte[] dst, + long timestamp, int trackNumber) { + ByteBuffer buf = ByteBuffer.allocate(ParseJREAPC.HEADER_SIZE).order(ByteOrder.BIG_ENDIAN); + buf.putShort((short) wordType); + buf.put((byte) classCode); + buf.put((byte) flags); + buf.putInt((int) seqNum); + buf.put(src, 0, 8); + buf.put(dst, 0, 8); + buf.putInt((int) timestamp); + buf.putShort((short) trackNumber); + buf.putShort((short) 0); // reserved + return buf.array(); + } + + /** Build a full JREAP-C message (header + payload). 
*/ + static byte[] buildMessage(int wordType, int classCode, int flags, + long seqNum, byte[] src, byte[] dst, + long timestamp, int trackNumber, byte[] payload) { + byte[] header = buildHeader(wordType, classCode, flags, seqNum, src, dst, timestamp, trackNumber); + byte[] msg = new byte[header.length + payload.length]; + System.arraycopy(header, 0, msg, 0, header.length); + System.arraycopy(payload, 0, msg, header.length, payload.length); + return msg; + } +} diff --git a/nifi-tdf-processors/pom.xml b/nifi-tdf-processors/pom.xml index 0e0fe87..c80fe91 100644 --- a/nifi-tdf-processors/pom.xml +++ b/nifi-tdf-processors/pom.xml @@ -96,6 +96,20 @@ 1.14.17 test + + + org.apache.nifi + nifi-security-utils-api + ${nifi.version} + test + + + + ${project.groupId} + nifi-jreapc-processors + 0.10.0 + test + diff --git a/nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java b/nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java new file mode 100644 index 0000000..0d34688 --- /dev/null +++ b/nifi-tdf-processors/src/main/java/io/opentdf/nifi/ABACEnforcement.java @@ -0,0 +1,290 @@ +package io.opentdf.nifi; + +import io.opentdf.platform.authorization.AuthorizationServiceGrpc; +import io.opentdf.platform.authorization.DecisionRequest; +import io.opentdf.platform.authorization.DecisionResponse; +import io.opentdf.platform.authorization.Entity; +import io.opentdf.platform.authorization.EntityChain; +import io.opentdf.platform.authorization.GetDecisionsRequest; +import io.opentdf.platform.authorization.GetDecisionsResponse; +import io.opentdf.platform.authorization.ResourceAttribute; +import io.opentdf.platform.policy.Action; +import io.opentdf.platform.sdk.SDK; +import org.apache.nifi.annotation.behavior.ReadsAttribute; +import org.apache.nifi.annotation.behavior.ReadsAttributes; +import org.apache.nifi.annotation.behavior.WritesAttribute; +import org.apache.nifi.annotation.behavior.WritesAttributes; +import org.apache.nifi.annotation.documentation.CapabilityDescription; +import org.apache.nifi.annotation.documentation.Tags; +import org.apache.nifi.components.AllowableValue; +import org.apache.nifi.components.PropertyDescriptor; +import org.apache.nifi.expression.ExpressionLanguageScope; +import org.apache.nifi.flowfile.FlowFile; +import org.apache.nifi.processor.ProcessContext; +import org.apache.nifi.processor.ProcessSession; +import org.apache.nifi.processor.Relationship; +import org.apache.nifi.processor.exception.ProcessException; +import org.apache.nifi.processor.util.StandardValidators; + +import java.util.Arrays; +import java.util.Collections; +import java.util.HashSet; +import java.util.List; +import java.util.Set; +import java.util.concurrent.TimeUnit; + +/** + * Calls the OpenTDF Authorization Service (GetDecisions) to make an ABAC permit/deny + * decision for a flow file. Routes to "permit" or "deny" based on the response. + * + * The processor reads the {@code tdf_attribute} flow file attribute as a + * comma-separated list of OpenTDF resource attribute FQNs and submits them to the + * authorization service. Any upstream processor that populates {@code tdf_attribute} + * can feed into this processor — it is not tied to any specific protocol or format. + * + * Example flow: + * [Source] → [UpdateAttribute tdf_attribute=...] → [ABACEnforcement] → permit/deny/failure + */ +@CapabilityDescription("Calls the OpenTDF Authorization Service GetDecisions endpoint to make an " + + "ABAC permit/deny decision for the flow file. 
Reads resource attribute FQNs from the " + + "'tdf_attribute' flow file attribute and routes to 'permit', 'deny', or 'failure' " + + "based on the response. Works with any flow that sets tdf_attribute upstream.") +@Tags({"ABAC", "authorization", "OpenTDF", "policy", "enforcement", "permit", "deny", "access control"}) +@ReadsAttributes({ + @ReadsAttribute(attribute = "tdf_attribute", + description = "Comma-separated OpenTDF resource attribute value FQNs used as the " + + "resource context for the authorization decision. Required — flow files " + + "missing this attribute are routed to failure."), +}) +@WritesAttributes({ + @WritesAttribute(attribute = "abac.decision", + description = "PERMIT or DENY"), + @WritesAttribute(attribute = "abac.entity_id", + description = "The entity ID used in the authorization request"), + @WritesAttribute(attribute = "abac.resource_attributes", + description = "Comma-separated resource attribute FQNs that were evaluated"), + @WritesAttribute(attribute = "abac.processing_time_ms", + description = "Time taken for the GetDecisions call in milliseconds"), +}) +public class ABACEnforcement extends AbstractTDFProcessor { + + // ─── Relationships ──────────────────────────────────────────────────────── + + static final Relationship REL_PERMIT = new Relationship.Builder() + .name("permit") + .description("Authorization service returned DECISION_PERMIT") + .build(); + + static final Relationship REL_DENY = new Relationship.Builder() + .name("deny") + .description("Authorization service returned DECISION_DENY") + .build(); + + @Override + public Set getRelationships() { + return new HashSet<>(Arrays.asList(REL_PERMIT, REL_DENY, REL_FAILURE)); + } + + // ─── Properties ────────────────────────────────────────────────────────── + + static final AllowableValue ENTITY_TYPE_CLIENT_ID = new AllowableValue("CLIENT_ID", "Client ID"); + static final AllowableValue ENTITY_TYPE_EMAIL = new AllowableValue("EMAIL", "Email Address"); + static final AllowableValue ENTITY_TYPE_USERNAME = new AllowableValue("USERNAME", "Username"); + + static final PropertyDescriptor ENTITY_ID = new PropertyDescriptor.Builder() + .name("Entity ID") + .displayName("Entity ID") + .description("The entity (user, service account, or client) making the data access request. " + + "Used as the subject in the GetDecisions call. Supports Expression Language to " + + "read from flow file attributes (e.g. ${jwt.sub}).") + .required(true) + .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES) + .addValidator(StandardValidators.NON_BLANK_VALIDATOR) + .build(); + + static final PropertyDescriptor ENTITY_TYPE = new PropertyDescriptor.Builder() + .name("Entity Type") + .displayName("Entity Type") + .description("How to interpret the Entity ID value.") + .required(true) + .allowableValues(ENTITY_TYPE_CLIENT_ID, ENTITY_TYPE_EMAIL, ENTITY_TYPE_USERNAME) + .defaultValue("CLIENT_ID") + .build(); + + static final PropertyDescriptor DEFAULT_RESOURCE_ATTRIBUTES = new PropertyDescriptor.Builder() + .name("Default Resource Attribute FQNs") + .displayName("Default Resource Attribute FQNs") + .description("Comma-separated list of attribute value FQNs to use when the 'tdf_attribute' " + + "flow file attribute is not set. Leave blank to require tdf_attribute on every message. 
" + + "Example: https://classification.example.org/attr/level/value/secret") + .required(false) + .addValidator(StandardValidators.NON_BLANK_VALIDATOR) + .expressionLanguageSupported(ExpressionLanguageScope.ENVIRONMENT) + .build(); + + static final PropertyDescriptor DECISION_TIMEOUT_SECONDS = new PropertyDescriptor.Builder() + .name("Decision Timeout (seconds)") + .displayName("Decision Timeout (seconds)") + .description("Maximum time to wait for a GetDecisions response from the authorization service.") + .required(true) + .defaultValue("5") + .addValidator(StandardValidators.POSITIVE_INTEGER_VALIDATOR) + .build(); + + static final PropertyDescriptor FAIL_OPEN = new PropertyDescriptor.Builder() + .name("Fail Open") + .displayName("Fail Open on Authorization Error") + .description("When true, routes to 'permit' if the authorization service is unreachable " + + "or returns an error. When false (default), routes to 'failure'.") + .required(true) + .allowableValues("true", "false") + .defaultValue("false") + .build(); + + @Override + public List getSupportedPropertyDescriptors() { + List props = new java.util.ArrayList<>(super.getSupportedPropertyDescriptors()); + props.add(ENTITY_ID); + props.add(ENTITY_TYPE); + props.add(DEFAULT_RESOURCE_ATTRIBUTES); + props.add(DECISION_TIMEOUT_SECONDS); + props.add(FAIL_OPEN); + return Collections.unmodifiableList(props); + } + + // ─── Processor logic ───────────────────────────────────────────────────── + + @Override + void processFlowFiles(ProcessContext ctx, ProcessSession session, List flowFiles) + throws ProcessException { + + SDK sdk = getTDFSDK(ctx); + AuthorizationServiceGrpc.AuthorizationServiceFutureStub authStub = + sdk.getServices().authorization(); + + int timeoutSeconds = ctx.getProperty(DECISION_TIMEOUT_SECONDS).asInteger(); + Boolean failOpenVal = ctx.getProperty(FAIL_OPEN).asBoolean(); + if (failOpenVal == null) { + throw new ProcessException("Fail Open property did not resolve to 'true' or 'false'"); + } + boolean failOpen = failOpenVal; + + String defaultAttrFqns = ctx.getProperty(DEFAULT_RESOURCE_ATTRIBUTES).isSet() + ? 
ctx.getProperty(DEFAULT_RESOURCE_ATTRIBUTES).evaluateAttributeExpressions().getValue() + : null; + + for (FlowFile flowFile : flowFiles) { + long startMs = System.currentTimeMillis(); + try { + // Resolve entity ID + String entityId = ctx.getProperty(ENTITY_ID) + .evaluateAttributeExpressions(flowFile).getValue(); + String entityType = ctx.getProperty(ENTITY_TYPE).getValue(); + + // Resolve resource attributes from tdf_attribute or default + String attrFqnsCsv = flowFile.getAttribute("tdf_attribute"); + if (attrFqnsCsv == null || attrFqnsCsv.isBlank()) { + attrFqnsCsv = defaultAttrFqns; + } + if (attrFqnsCsv == null || attrFqnsCsv.isBlank()) { + throw new ProcessException("No resource attributes: set tdf_attribute on flow file " + + "or configure 'Default Resource Attribute FQNs'"); + } + + // Build entity + Entity.Builder entityBuilder = Entity.newBuilder().setId("entity-0"); + switch (entityType) { + case "EMAIL" -> entityBuilder.setEmailAddress(entityId); + case "USERNAME" -> entityBuilder.setUserName(entityId); + default -> entityBuilder.setClientId(entityId); + } + Entity entity = entityBuilder.build(); + + // Build entity chain + EntityChain entityChain = EntityChain.newBuilder() + .setId("ec-0") + .addEntities(entity) + .build(); + + // Build resource attributes + ResourceAttribute.Builder raBuilder = ResourceAttribute.newBuilder() + .setResourceAttributesId("ra-0"); + for (String fqn : attrFqnsCsv.split(",")) { + String trimmed = fqn.trim(); + if (!trimmed.isEmpty()) raBuilder.addAttributeValueFqns(trimmed); + } + if (raBuilder.getAttributeValueFqnsCount() == 0) { + throw new ProcessException("Resource attribute FQN list is empty after parsing: " + attrFqnsCsv); + } + ResourceAttribute resourceAttribute = raBuilder.build(); + + // Build action (TRANSMIT for data forwarding) + Action action = Action.newBuilder() + .setStandard(Action.StandardAction.STANDARD_ACTION_TRANSMIT) + .build(); + + // Build and fire GetDecisions request + DecisionRequest decisionRequest = DecisionRequest.newBuilder() + .addActions(action) + .addEntityChains(entityChain) + .addResourceAttributes(resourceAttribute) + .build(); + + GetDecisionsRequest request = GetDecisionsRequest.newBuilder() + .addDecisionRequests(decisionRequest) + .build(); + + GetDecisionsResponse response = authStub.getDecisions(request) + .get(timeoutSeconds, TimeUnit.SECONDS); + + long elapsedMs = System.currentTimeMillis() - startMs; + + // Evaluate decision — empty response is not a permit + if (response.getDecisionResponsesList().isEmpty()) { + throw new ProcessException("Authorization service returned no decisions"); + } + DecisionResponse.Decision overallDecision = DecisionResponse.Decision.DECISION_PERMIT; + for (DecisionResponse dr : response.getDecisionResponsesList()) { + if (dr.getDecision() != DecisionResponse.Decision.DECISION_PERMIT) { + overallDecision = DecisionResponse.Decision.DECISION_DENY; + break; + } + } + + String decisionLabel = overallDecision == DecisionResponse.Decision.DECISION_PERMIT + ? 
"PERMIT" : "DENY"; + + flowFile = session.putAttribute(flowFile, "abac.decision", decisionLabel); + flowFile = session.putAttribute(flowFile, "abac.entity_id", entityId); + flowFile = session.putAttribute(flowFile, "abac.resource_attributes", attrFqnsCsv); + flowFile = session.putAttribute(flowFile, "abac.processing_time_ms", + String.valueOf(elapsedMs)); + + getLogger().info("ABAC decision: {} | attrs={} | {}ms", + decisionLabel, attrFqnsCsv, elapsedMs); + getLogger().debug("ABAC subject: {}", entityId); + + Relationship rel = overallDecision == DecisionResponse.Decision.DECISION_PERMIT + ? REL_PERMIT : REL_DENY; + session.transfer(flowFile, rel); + + } catch (ProcessException pe) { + // Local validation failures (missing attributes, bad config) are never + // fail-open — unclassified or malformed flow files must not bypass policy. + getLogger().error("ABAC request validation failed for FlowFile {}: {}", + flowFile.getId(), pe.getMessage()); + session.transfer(flowFile, REL_FAILURE); + } catch (Exception e) { + // Remote call failures (network, timeout, service unavailable) respect failOpen. + getLogger().error("ABAC authorization call failed for FlowFile {}", flowFile.getId(), e); + if (failOpen) { + flowFile = session.putAttribute(flowFile, "abac.decision", "PERMIT"); + flowFile = session.putAttribute(flowFile, "abac.error", e.getMessage()); + session.transfer(flowFile, REL_PERMIT); + } else { + session.transfer(flowFile, REL_FAILURE); + } + } + } + } +} diff --git a/nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractTDFProcessor.java b/nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractTDFProcessor.java index ba69902..13ea444 100644 --- a/nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractTDFProcessor.java +++ b/nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractTDFProcessor.java @@ -43,7 +43,7 @@ public AbstractTDFProcessor() { .name("FlowFile queue pull limit") .description("FlowFile queue pull size limit") .required(true) - .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY) + .expressionLanguageSupported(ExpressionLanguageScope.ENVIRONMENT) .defaultValue("10") .addValidator(StandardValidators.INTEGER_VALIDATOR) .build(); @@ -192,7 +192,7 @@ byte[] readEntireFlowFile(FlowFile flowFile, ProcessSession processSession) { @Override public void onTrigger(ProcessContext processContext, ProcessSession processSession) throws ProcessException { - List flowFiles = processSession.get(processContext.getProperty(FLOWFILE_PULL_SIZE).asInteger()); + List flowFiles = processSession.get(processContext.getProperty(FLOWFILE_PULL_SIZE).evaluateAttributeExpressions().asInteger()); if (!flowFiles.isEmpty()) { processFlowFiles(processContext, processSession, flowFiles); } diff --git a/nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractToProcessor.java b/nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractToProcessor.java index 0a4367d..9362cd2 100644 --- a/nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractToProcessor.java +++ b/nifi-tdf-processors/src/main/java/io/opentdf/nifi/AbstractToProcessor.java @@ -33,7 +33,7 @@ public AbstractToProcessor() { */ public static final PropertyDescriptor KAS_URL = new org.apache.nifi.components.PropertyDescriptor.Builder() .name("KAS URL") - .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY) + .expressionLanguageSupported(ExpressionLanguageScope.ENVIRONMENT) .description("The KAS Url to use for encryption; this is a default if the kas_url attribute is not present in the flow 
file") .required(false) .addValidator(StandardValidators.NON_BLANK_VALIDATOR) diff --git a/nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java b/nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java index d55ea56..a6044f8 100644 --- a/nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java +++ b/nifi-tdf-processors/src/main/java/io/opentdf/nifi/ConvertToZTDF.java @@ -60,23 +60,28 @@ public ConvertToZTDF() { } /** - * Property descriptor for the "Sign Assertions" feature in the ConvertToZTDF processor. This property allows specifying whether - * the assertions should be signed or not. It is not a required property and defaults to "false". - *

- * - Name: Sign Assertions - * - Description: sign assertions - * - Required: false - * - Default Value: false - * - Allowable Values: true, false - * - Expression Language Supported: {@link ExpressionLanguageScope#VARIABLE_REGISTRY} + * Property descriptor for the "Enable Encryption" feature in the ConvertToZTDF processor. + * When false, the flow file passes through without TDF encryption (tag-only / ABAC-only mode). + * Required, defaults to "true". Supports ENVIRONMENT-scoped expression language. */ + public static final PropertyDescriptor ENABLE_ENCRYPTION = new org.apache.nifi.components.PropertyDescriptor.Builder() + .name("Enable Encryption") + .description("When false, the flow file passes through without TDF encryption. " + + "Use this for ABAC policy enforcement only (tag-only mode), mirroring the " + + "GATEWAY_ABAC_ENCRYPT_EMAIL=0 behavior in the Virtru Gateway.") + .required(true) + .defaultValue("true") + .allowableValues("true", "false") + .expressionLanguageSupported(ExpressionLanguageScope.ENVIRONMENT) + .build(); + public static final PropertyDescriptor SIGN_ASSERTIONS = new org.apache.nifi.components.PropertyDescriptor.Builder() .name("Sign Assertions") .description("sign assertions") .required(false) .defaultValue("false") .allowableValues("true", "false") - .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY) + .expressionLanguageSupported(ExpressionLanguageScope.ENVIRONMENT) .build(); /** @@ -94,7 +99,7 @@ public ConvertToZTDF() { .required(true) .identifiesControllerService(PrivateKeyService.class) .dependsOn(SIGN_ASSERTIONS, new AllowableValue("true")) - .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY) + .expressionLanguageSupported(ExpressionLanguageScope.ENVIRONMENT) .build(); @@ -118,6 +123,7 @@ PrivateKeyService getPrivateKeyService(ProcessContext processContext) { @Override public List getSupportedPropertyDescriptors() { List propertyDescriptors = new ArrayList<>(super.getSupportedPropertyDescriptors()); + propertyDescriptors.add(ENABLE_ENCRYPTION); propertyDescriptors.add(PRIVATE_KEY_CONTROLLER_SERVICE); propertyDescriptors.add(SIGN_ASSERTIONS); return Collections.unmodifiableList(propertyDescriptors); @@ -195,6 +201,11 @@ private void populateFieldFromMap(Map sourceMap, String key, Map des */ @Override void processFlowFiles(ProcessContext processContext, ProcessSession processSession, List flowFiles) throws ProcessException { + Boolean encryptionEnabled = processContext.getProperty(ENABLE_ENCRYPTION).evaluateAttributeExpressions().asBoolean(); + if (encryptionEnabled == null) { + throw new ProcessException("Enable Encryption property did not resolve to 'true' or 'false'"); + } + SDK sdk = getTDFSDK(processContext); for (final FlowFile flowFile : flowFiles) { try { @@ -203,6 +214,9 @@ void processFlowFiles(ProcessContext processContext, ProcessSession processSessi //build baseline TDF Config options List> configurationOptions = new ArrayList<>(Arrays.asList(Config.withKasInformation(kasInfoList.toArray(new Config.KASInfo[0])), Config.withDataAttributes(dataAttributes.toArray(new String[0])))); + if (!encryptionEnabled) { + configurationOptions.add(cfg -> cfg.enableEncryption = false); + } List nifiAssertionAttributeKeys = flowFile.getAttributes().keySet().stream().filter(x->x.startsWith(TDF_ASSERTION_PREFIX)).toList(); for(String nifiAssertionAttributeKey: nifiAssertionAttributeKeys) { getLogger().debug(String.format("Adding assertion for NiFi attribute = %s", nifiAssertionAttributeKey)); diff --git 
a/nifi-tdf-processors/src/main/java/io/opentdf/nifi/SimpleOpenTDFControllerService.java b/nifi-tdf-processors/src/main/java/io/opentdf/nifi/SimpleOpenTDFControllerService.java index 74212c9..7fd17ca 100644 --- a/nifi-tdf-processors/src/main/java/io/opentdf/nifi/SimpleOpenTDFControllerService.java +++ b/nifi-tdf-processors/src/main/java/io/opentdf/nifi/SimpleOpenTDFControllerService.java @@ -37,7 +37,7 @@ public SimpleOpenTDFControllerService() { .name("platform-endpoint") .displayName("OpenTDF Platform ENDPOINT") .required(true) - .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY) + .expressionLanguageSupported(ExpressionLanguageScope.ENVIRONMENT) .addValidator(StandardValidators.NON_EMPTY_VALIDATOR) .sensitive(false) .description("OpenTDF Platform ENDPOINT in GRPC compatible format (no protocol prefix)") @@ -55,7 +55,7 @@ public SimpleOpenTDFControllerService() { .name("clientSecret") .displayName("Client Secret") .required(true) - .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY) + .expressionLanguageSupported(ExpressionLanguageScope.ENVIRONMENT) .addValidator(StandardValidators.NON_EMPTY_VALIDATOR) .sensitive(true) .description("OpenTDF Platform Authentication Client Secret") @@ -70,7 +70,7 @@ public SimpleOpenTDFControllerService() { .name("clientId") .displayName("Client ID") .required(true) - .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY) + .expressionLanguageSupported(ExpressionLanguageScope.ENVIRONMENT) .addValidator(StandardValidators.NON_EMPTY_VALIDATOR) .sensitive(false) .description("OpenTDF Platform Authentication Client ID") diff --git a/nifi-tdf-processors/src/main/resources/META-INF/services/org.apache.nifi.processor.Processor b/nifi-tdf-processors/src/main/resources/META-INF/services/org.apache.nifi.processor.Processor index 79acb08..6fc52e5 100644 --- a/nifi-tdf-processors/src/main/resources/META-INF/services/org.apache.nifi.processor.Processor +++ b/nifi-tdf-processors/src/main/resources/META-INF/services/org.apache.nifi.processor.Processor @@ -1,2 +1,3 @@ io.opentdf.nifi.ConvertFromZTDF io.opentdf.nifi.ConvertToZTDF +io.opentdf.nifi.ABACEnforcement diff --git a/nifi-tdf-processors/src/test/java/io/opentdf/nifi/ABACEnforcementTest.java b/nifi-tdf-processors/src/test/java/io/opentdf/nifi/ABACEnforcementTest.java new file mode 100644 index 0000000..43bc0f0 --- /dev/null +++ b/nifi-tdf-processors/src/test/java/io/opentdf/nifi/ABACEnforcementTest.java @@ -0,0 +1,230 @@ +package io.opentdf.nifi; + +import com.google.common.util.concurrent.Futures; +import io.opentdf.platform.authorization.AuthorizationServiceGrpc; +import io.opentdf.platform.authorization.DecisionResponse; +import io.opentdf.platform.authorization.GetDecisionsRequest; +import io.opentdf.platform.authorization.GetDecisionsResponse; +import io.opentdf.platform.sdk.SDK; +import org.apache.nifi.processor.ProcessContext; +import org.apache.nifi.util.MockFlowFile; +import org.apache.nifi.util.TestRunner; +import org.apache.nifi.util.TestRunners; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; + +import java.util.List; +import java.util.Map; +import java.util.concurrent.TimeoutException; + +import static org.junit.jupiter.api.Assertions.*; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.Mockito.*; + +class ABACEnforcementTest { + + private static final String ENTITY_ID_VALUE = "nifi-service-account"; + private static final String TDF_ATTR_FQN = 
"https://ns.example.mil/attr/classification/value/secret"; + private static final byte[] PAYLOAD = "tactical message content".getBytes(); + + // ─── Mock inner class ───────────────────────────────────────────────────── + + static class MockABACEnforcement extends ABACEnforcement { + SDK mockSDK; + @Override + SDK getTDFSDK(ProcessContext ctx) { return mockSDK; } + } + + // ─── Fixtures ───────────────────────────────────────────────────────────── + + private MockABACEnforcement processor; + private TestRunner runner; + private AuthorizationServiceGrpc.AuthorizationServiceFutureStub mockAuthStub; + + @BeforeEach + void setup() throws Exception { + processor = new MockABACEnforcement(); + + SDK mockSDK = mock(SDK.class); + SDK.Services mockServices = mock(SDK.Services.class); + mockAuthStub = mock(AuthorizationServiceGrpc.AuthorizationServiceFutureStub.class); + when(mockSDK.getServices()).thenReturn(mockServices); + when(mockServices.authorization()).thenReturn(mockAuthStub); + processor.mockSDK = mockSDK; + + runner = TestRunners.newTestRunner(processor); + Utils.setupTDFControllerService(runner); + runner.setProperty(ABACEnforcement.ENTITY_ID, ENTITY_ID_VALUE); + runner.setProperty(ABACEnforcement.ENTITY_TYPE, "CLIENT_ID"); + } + + // ─── PERMIT / DENY ─────────────────────────────────────────────────────── + + @Test + void permit_routesToPermitWithDecisionAttribute() { + stubAuthResponse(DecisionResponse.Decision.DECISION_PERMIT); + + runner.enqueue(PAYLOAD, Map.of("tdf_attribute", TDF_ATTR_FQN)); + runner.run(1); + + List permitted = runner.getFlowFilesForRelationship(ABACEnforcement.REL_PERMIT); + assertEquals(1, permitted.size()); + assertEquals(0, runner.getFlowFilesForRelationship(ABACEnforcement.REL_DENY).size()); + assertEquals(0, runner.getFlowFilesForRelationship(ABACEnforcement.REL_FAILURE).size()); + + MockFlowFile ff = permitted.get(0); + assertEquals("PERMIT", ff.getAttribute("abac.decision")); + assertEquals(ENTITY_ID_VALUE, ff.getAttribute("abac.entity_id")); + assertEquals(TDF_ATTR_FQN, ff.getAttribute("abac.resource_attributes")); + assertNotNull(ff.getAttribute("abac.processing_time_ms")); + // Binary content must pass through unchanged + assertArrayEquals(PAYLOAD, ff.toByteArray()); + } + + @Test + void deny_routesToDenyWithDecisionAttribute() { + stubAuthResponse(DecisionResponse.Decision.DECISION_DENY); + + runner.enqueue(PAYLOAD, Map.of("tdf_attribute", TDF_ATTR_FQN)); + runner.run(1); + + List denied = runner.getFlowFilesForRelationship(ABACEnforcement.REL_DENY); + assertEquals(1, denied.size()); + assertEquals(0, runner.getFlowFilesForRelationship(ABACEnforcement.REL_PERMIT).size()); + assertEquals("DENY", denied.get(0).getAttribute("abac.decision")); + assertArrayEquals(PAYLOAD, denied.get(0).toByteArray()); + } + + @Test + void anyDenyInMultipleDecisions_overallDeny() { + when(mockAuthStub.getDecisions(any(GetDecisionsRequest.class))) + .thenReturn(Futures.immediateFuture(GetDecisionsResponse.newBuilder() + .addDecisionResponses(decisionOf(DecisionResponse.Decision.DECISION_PERMIT)) + .addDecisionResponses(decisionOf(DecisionResponse.Decision.DECISION_DENY)) + .build())); + + runner.enqueue(PAYLOAD, Map.of("tdf_attribute", TDF_ATTR_FQN)); + runner.run(1); + + assertEquals(1, runner.getFlowFilesForRelationship(ABACEnforcement.REL_DENY).size()); + } + + // ─── Validation failures (fail-closed regardless of Fail Open setting) ─── + + @Test + void missingTdfAttribute_noDefault_routesToFailure() { + // No tdf_attribute on flow file, no Default Resource Attribute 
FQNs configured + runner.enqueue(PAYLOAD); + runner.run(1); + + assertEquals(1, runner.getFlowFilesForRelationship(ABACEnforcement.REL_FAILURE).size()); + assertEquals(0, runner.getFlowFilesForRelationship(ABACEnforcement.REL_PERMIT).size()); + verifyNoInteractions(mockAuthStub); + } + + @Test + void blankTdfAttribute_noDefault_routesToFailure() { + runner.enqueue(PAYLOAD, Map.of("tdf_attribute", " ")); + runner.run(1); + + assertEquals(1, runner.getFlowFilesForRelationship(ABACEnforcement.REL_FAILURE).size()); + verifyNoInteractions(mockAuthStub); + } + + @Test + void allBlankFqnsAfterSplit_routesToFailure() { + // All entries are blank after splitting — empty FQN list must never default-permit + runner.enqueue(PAYLOAD, Map.of("tdf_attribute", " , , ")); + runner.run(1); + + assertEquals(1, runner.getFlowFilesForRelationship(ABACEnforcement.REL_FAILURE).size()); + verifyNoInteractions(mockAuthStub); + } + + @Test + void emptyDecisionList_routesToFailure() { + // Auth service returns a response with no decisions — ambiguous, treat as failure + when(mockAuthStub.getDecisions(any(GetDecisionsRequest.class))) + .thenReturn(Futures.immediateFuture(GetDecisionsResponse.newBuilder().build())); + + runner.enqueue(PAYLOAD, Map.of("tdf_attribute", TDF_ATTR_FQN)); + runner.run(1); + + assertEquals(1, runner.getFlowFilesForRelationship(ABACEnforcement.REL_FAILURE).size()); + } + + // ─── Default resource attributes ───────────────────────────────────────── + + @Test + void defaultResourceAttributes_usedWhenTdfAttributeAbsent() { + stubAuthResponse(DecisionResponse.Decision.DECISION_PERMIT); + runner.setProperty(ABACEnforcement.DEFAULT_RESOURCE_ATTRIBUTES, TDF_ATTR_FQN); + + // Flow file has no tdf_attribute — should fall back to the default + runner.enqueue(PAYLOAD); + runner.run(1); + + assertEquals(1, runner.getFlowFilesForRelationship(ABACEnforcement.REL_PERMIT).size()); + assertEquals(TDF_ATTR_FQN, + runner.getFlowFilesForRelationship(ABACEnforcement.REL_PERMIT) + .get(0).getAttribute("abac.resource_attributes")); + } + + @Test + void tdfAttributeOverridesDefault_whenBothSet() { + stubAuthResponse(DecisionResponse.Decision.DECISION_PERMIT); + String defaultAttr = "https://ns.example.mil/attr/classification/value/unclassified"; + runner.setProperty(ABACEnforcement.DEFAULT_RESOURCE_ATTRIBUTES, defaultAttr); + + String flowFileAttr = TDF_ATTR_FQN; + runner.enqueue(PAYLOAD, Map.of("tdf_attribute", flowFileAttr)); + runner.run(1); + + MockFlowFile ff = runner.getFlowFilesForRelationship(ABACEnforcement.REL_PERMIT).get(0); + assertEquals(flowFileAttr, ff.getAttribute("abac.resource_attributes"), + "Flow file tdf_attribute must take precedence over default"); + } + + // ─── Remote call failure / Fail Open ───────────────────────────────────── + + @Test + void authServiceException_failClosedDefault_routesToFailure() { + when(mockAuthStub.getDecisions(any(GetDecisionsRequest.class))) + .thenThrow(new RuntimeException("gRPC channel closed")); + + runner.enqueue(PAYLOAD, Map.of("tdf_attribute", TDF_ATTR_FQN)); + runner.run(1); + + assertEquals(1, runner.getFlowFilesForRelationship(ABACEnforcement.REL_FAILURE).size()); + assertEquals(0, runner.getFlowFilesForRelationship(ABACEnforcement.REL_PERMIT).size()); + } + + @Test + void authServiceTimeout_failOpen_routesToPermit() { + when(mockAuthStub.getDecisions(any(GetDecisionsRequest.class))) + .thenReturn(Futures.immediateFailedFuture(new TimeoutException("upstream timeout"))); + runner.setProperty(ABACEnforcement.FAIL_OPEN, "true"); + + runner.enqueue(PAYLOAD, 
Map.of("tdf_attribute", TDF_ATTR_FQN)); + runner.run(1); + + List permitted = runner.getFlowFilesForRelationship(ABACEnforcement.REL_PERMIT); + assertEquals(1, permitted.size()); + assertEquals("PERMIT", permitted.get(0).getAttribute("abac.decision")); + assertNotNull(permitted.get(0).getAttribute("abac.error"), + "abac.error must be set on fail-open permit"); + } + + // ─── Helpers ───────────────────────────────────────────────────────────── + + private void stubAuthResponse(DecisionResponse.Decision decision) { + when(mockAuthStub.getDecisions(any(GetDecisionsRequest.class))) + .thenReturn(Futures.immediateFuture(GetDecisionsResponse.newBuilder() + .addDecisionResponses(decisionOf(decision)) + .build())); + } + + private static DecisionResponse decisionOf(DecisionResponse.Decision decision) { + return DecisionResponse.newBuilder().setDecision(decision).build(); + } +} diff --git a/nifi-tdf-processors/src/test/java/io/opentdf/nifi/JREAPCPipelineTest.java b/nifi-tdf-processors/src/test/java/io/opentdf/nifi/JREAPCPipelineTest.java new file mode 100644 index 0000000..7c6ff4d --- /dev/null +++ b/nifi-tdf-processors/src/test/java/io/opentdf/nifi/JREAPCPipelineTest.java @@ -0,0 +1,291 @@ +package io.opentdf.nifi; + +import com.google.common.util.concurrent.Futures; +import io.opentdf.platform.authorization.AuthorizationServiceGrpc; +import io.opentdf.platform.authorization.DecisionResponse; +import io.opentdf.platform.authorization.GetDecisionsRequest; +import io.opentdf.platform.authorization.GetDecisionsResponse; +import io.opentdf.platform.sdk.SDK; +import io.opentdf.platform.sdk.TDF; +import org.apache.nifi.processor.ProcessContext; +import org.apache.nifi.util.MockFlowFile; +import org.apache.nifi.util.TestRunner; +import org.apache.nifi.util.TestRunners; +import org.junit.jupiter.api.Test; + +import java.nio.ByteBuffer; +import java.nio.ByteOrder; +import java.util.HashMap; +import java.util.List; +import java.util.Map; + +import static org.junit.jupiter.api.Assertions.*; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.Mockito.*; + +/** + * End-to-end pipeline test: ParseJREAPC → ABACEnforcement → ConvertToZTDF. + * + *
+ * <p>Verifies that JREAP-C binary content is preserved byte-for-byte through
+ * the full classification-tagging and ABAC enforcement pipeline before the TDF
+ * wrapping step receives it. Two scenarios are covered:
+ * <ul>
+ *   <li>PERMIT — message flows through all three stages; TDF receives original bytes.</li>
+ *   <li>DENY — message is stopped at ABACEnforcement and never reaches ConvertToZTDF.</li>
+ * </ul>
+ */ +class JREAPCPipelineTest { + + // ─── JREAP-C binary builder (32-byte big-endian header + payload) ───────── + + private static final int HEADER_SIZE = 32; + + /** + * Builds a synthetic JREAP-C message with the given word-type, classification + * code, and arbitrary payload bytes appended after the fixed header. + * + *

+     * <p>Header layout (big-endian):
+     *
+     * <pre>
+     *  offset  len  field
+     *   0       2   word type
+     *   2       1   classification code  (0=UNCLASSIFIED, 1=CUI, 2=SECRET, 3=TOP SECRET)
+     *   3       1   flags                (bit0=exercise, bit1=simulation)
+     *   4       4   sequence number
+     *   8       8   source address
+     *  16       8   destination address
+     *  24       4   timestamp (UNIX, 32-bit)
+     *  28       2   track number
+     *  30       2   reserved
+     * </pre>
+ */ + static byte[] buildJreapCMessage(int wordType, int classCode, int seqNum, + long timestamp, int trackNumber, byte[] payload) { + ByteBuffer buf = ByteBuffer.allocate(HEADER_SIZE + payload.length) + .order(ByteOrder.BIG_ENDIAN); + buf.putShort((short) wordType); + buf.put((byte) classCode); + buf.put((byte) 0x00); // flags + buf.putInt(seqNum); + buf.put(new byte[8]); // source address (zeroed) + buf.put(new byte[8]); // dest address (zeroed) + buf.putInt((int) timestamp); + buf.putShort((short) trackNumber); + buf.putShort((short) 0); // reserved + buf.put(payload); + return buf.array(); + } + + // ─── Mock inner classes ─────────────────────────────────────────────────── + + static class MockABACEnforcement extends ABACEnforcement { + SDK mockSDK; + @Override + SDK getTDFSDK(ProcessContext ctx) { return mockSDK; } + } + + static class MockConvertToZTDF extends ConvertToZTDF { + SDK mockSDK; + TDF mockTDF; + @Override + SDK getTDFSDK(ProcessContext ctx) { return mockSDK; } + @Override + TDF getTDF() { return mockTDF; } + } + + // ─── Tests ──────────────────────────────────────────────────────────────── + + /** + * Happy-path: SECRET JREAP-C message is parsed, ABAC returns PERMIT, + * and ConvertToZTDF receives the original binary unchanged. + */ + @Test + void secretMessage_permitDecision_binaryPreservedThroughFullPipeline() throws Exception { + // Build a realistic JREAP-C message: J5.0 word type, SECRET, with a short payload + byte[] tacticalPayload = {0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08}; + byte[] jreapMsg = buildJreapCMessage( + 0x0500, // J5.0 word type + 2, // classification code 2 = SECRET + 42, // sequence number + 1700000000L, // timestamp + 7, // track number + tacticalPayload + ); + + // ── Stage 1: ParseJREAPC ───────────────────────────────────────────── + ParseJREAPC parseProcessor = new ParseJREAPC(); + TestRunner parseRunner = TestRunners.newTestRunner(parseProcessor); + parseRunner.setProperty(ParseJREAPC.CLASSIFICATION_ATTRIBUTE_NAMESPACE, + "https://classification.example.mil/attr/level"); + + parseRunner.enqueue(jreapMsg); + parseRunner.run(1); + + List parsedFiles = parseRunner.getFlowFilesForRelationship(ParseJREAPC.REL_SUCCESS); + assertEquals(1, parsedFiles.size(), "ParseJREAPC must route to success"); + MockFlowFile parsedFF = parsedFiles.get(0); + + // Verify classification was extracted and tdf_attribute was set + assertEquals("SECRET", parsedFF.getAttribute("jreapc.classification")); + assertEquals("https://classification.example.mil/attr/level/value/secret", + parsedFF.getAttribute("tdf_attribute")); + assertEquals("J5.0", parsedFF.getAttribute("jreapc.word_type")); + assertEquals("42", parsedFF.getAttribute("jreapc.sequence_number")); + + // Binary must be unchanged at this stage + assertArrayEquals(jreapMsg, parsedFF.toByteArray(), + "ParseJREAPC must not modify binary content"); + + // ── Stage 2: ABACEnforcement (mocked: PERMIT) ──────────────────────── + MockABACEnforcement abacProcessor = new MockABACEnforcement(); + + SDK mockAbacSDK = mock(SDK.class); + SDK.Services mockAbacServices = mock(SDK.Services.class); + AuthorizationServiceGrpc.AuthorizationServiceFutureStub mockAuthStub = + mock(AuthorizationServiceGrpc.AuthorizationServiceFutureStub.class); + when(mockAbacSDK.getServices()).thenReturn(mockAbacServices); + when(mockAbacServices.authorization()).thenReturn(mockAuthStub); + when(mockAuthStub.getDecisions(any(GetDecisionsRequest.class))) + .thenReturn(Futures.immediateFuture(GetDecisionsResponse.newBuilder() + 
.addDecisionResponses(DecisionResponse.newBuilder() + .setDecision(DecisionResponse.Decision.DECISION_PERMIT)) + .build())); + abacProcessor.mockSDK = mockAbacSDK; + + TestRunner abacRunner = TestRunners.newTestRunner(abacProcessor); + Utils.setupTDFControllerService(abacRunner); + abacRunner.setProperty(ABACEnforcement.ENTITY_ID, "nifi-pipeline-service"); + abacRunner.setProperty(ABACEnforcement.ENTITY_TYPE, "CLIENT_ID"); + + // Carry all attributes and content forward from ParseJREAPC output + Map parsedAttrs = new HashMap<>(parsedFF.getAttributes()); + abacRunner.enqueue(parsedFF.toByteArray(), parsedAttrs); + abacRunner.run(1); + + List permittedFiles = abacRunner.getFlowFilesForRelationship(ABACEnforcement.REL_PERMIT); + assertEquals(1, permittedFiles.size(), "ABACEnforcement must route SECRET to permit"); + assertEquals(0, abacRunner.getFlowFilesForRelationship(ABACEnforcement.REL_DENY).size()); + + MockFlowFile permittedFF = permittedFiles.get(0); + assertEquals("PERMIT", permittedFF.getAttribute("abac.decision")); + assertEquals("https://classification.example.mil/attr/level/value/secret", + permittedFF.getAttribute("abac.resource_attributes")); + + // Binary still unchanged after ABAC decision + assertArrayEquals(jreapMsg, permittedFF.toByteArray(), + "ABACEnforcement must not modify binary content"); + + // ── Stage 3: ConvertToZTDF (mocked: capture bytes fed to TDF) ──────── + MockConvertToZTDF tdfProcessor = new MockConvertToZTDF(); + + SDK mockTdfSDK = mock(SDK.class); + SDK.Services mockTdfServices = mock(SDK.Services.class); + SDK.KAS mockKAS = mock(SDK.KAS.class); + TDF mockTDF = mock(TDF.class); + when(mockTdfSDK.getServices()).thenReturn(mockTdfServices); + when(mockTdfServices.kas()).thenReturn(mockKAS); + tdfProcessor.mockSDK = mockTdfSDK; + tdfProcessor.mockTDF = mockTDF; + + // Capture the exact bytes ConvertToZTDF passes to TDF.createTDF() + final byte[][] capturedInputBytes = {null}; + doAnswer(inv -> { + java.io.InputStream is = inv.getArgument(0); + java.io.OutputStream os = inv.getArgument(1); + capturedInputBytes[0] = is.readAllBytes(); + os.write(("WRAPPED:" + capturedInputBytes[0].length + "b").getBytes()); + return null; + }).when(mockTDF).createTDF(any(), any(), any(), any(), any()); + + TestRunner tdfRunner = TestRunners.newTestRunner(tdfProcessor); + Utils.setupTDFControllerService(tdfRunner); + tdfRunner.setProperty(ConvertToZTDF.KAS_URL, "https://kas.example.mil"); + + Map permittedAttrs = new HashMap<>(permittedFF.getAttributes()); + tdfRunner.enqueue(permittedFF.toByteArray(), permittedAttrs); + tdfRunner.run(1); + + assertEquals(1, tdfRunner.getFlowFilesForRelationship(ConvertToZTDF.REL_SUCCESS).size(), + "ConvertToZTDF must succeed"); + + // ── Key assertion: TDF receives the original JREAP-C binary, byte-for-byte ── + assertNotNull(capturedInputBytes[0], "TDF.createTDF must have been called"); + assertArrayEquals(jreapMsg, capturedInputBytes[0], + "ConvertToZTDF must feed the original JREAP-C binary to the TDF library unchanged"); + } + + /** + * Deny-path: SECRET JREAP-C message is parsed but ABAC returns DENY — + * the message must be stopped before reaching ConvertToZTDF. 
+ */ + @Test + void secretMessage_denyDecision_neverReachesTdf() throws Exception { + byte[] jreapMsg = buildJreapCMessage(0x0500, 2, 1, 1700000000L, 3, + "sensitive tactical data".getBytes()); + + // Stage 1: ParseJREAPC + ParseJREAPC parseProcessor = new ParseJREAPC(); + TestRunner parseRunner = TestRunners.newTestRunner(parseProcessor); + parseRunner.setProperty(ParseJREAPC.CLASSIFICATION_ATTRIBUTE_NAMESPACE, + "https://classification.example.mil/attr/level"); + parseRunner.enqueue(jreapMsg); + parseRunner.run(1); + MockFlowFile parsedFF = parseRunner.getFlowFilesForRelationship(ParseJREAPC.REL_SUCCESS).get(0); + + // Stage 2: ABACEnforcement — DENY + MockABACEnforcement abacProcessor = new MockABACEnforcement(); + SDK mockAbacSDK = mock(SDK.class); + SDK.Services mockAbacServices = mock(SDK.Services.class); + AuthorizationServiceGrpc.AuthorizationServiceFutureStub mockAuthStub = + mock(AuthorizationServiceGrpc.AuthorizationServiceFutureStub.class); + when(mockAbacSDK.getServices()).thenReturn(mockAbacServices); + when(mockAbacServices.authorization()).thenReturn(mockAuthStub); + when(mockAuthStub.getDecisions(any(GetDecisionsRequest.class))) + .thenReturn(Futures.immediateFuture(GetDecisionsResponse.newBuilder() + .addDecisionResponses(DecisionResponse.newBuilder() + .setDecision(DecisionResponse.Decision.DECISION_DENY)) + .build())); + abacProcessor.mockSDK = mockAbacSDK; + + TestRunner abacRunner = TestRunners.newTestRunner(abacProcessor); + Utils.setupTDFControllerService(abacRunner); + abacRunner.setProperty(ABACEnforcement.ENTITY_ID, "unauthorized-client"); + abacRunner.setProperty(ABACEnforcement.ENTITY_TYPE, "CLIENT_ID"); + + abacRunner.enqueue(parsedFF.toByteArray(), new HashMap<>(parsedFF.getAttributes())); + abacRunner.run(1); + + // Message must be on the deny relationship — not permit, not failure + assertEquals(1, abacRunner.getFlowFilesForRelationship(ABACEnforcement.REL_DENY).size()); + assertEquals(0, abacRunner.getFlowFilesForRelationship(ABACEnforcement.REL_PERMIT).size()); + assertEquals(0, abacRunner.getFlowFilesForRelationship(ABACEnforcement.REL_FAILURE).size()); + + MockFlowFile deniedFF = abacRunner.getFlowFilesForRelationship(ABACEnforcement.REL_DENY).get(0); + assertEquals("DENY", deniedFF.getAttribute("abac.decision")); + + // The denied flow file would NOT be fed to ConvertToZTDF — no wrapping stage reached. + // Binary content is still intact on the deny branch (for audit/quarantine downstream). + assertArrayEquals(jreapMsg, deniedFF.toByteArray(), + "Binary content must be intact on the deny branch"); + } + + /** + * Top-secret message: classification code 3 → tdf_attribute slug is top_secret. 
+ */ + @Test + void topSecretMessage_tdfAttributeSlug_isTopSecret() throws Exception { + byte[] jreapMsg = buildJreapCMessage(0x0300, 3, 1, 1700000000L, 0, new byte[]{0x55}); + + ParseJREAPC parseProcessor = new ParseJREAPC(); + TestRunner parseRunner = TestRunners.newTestRunner(parseProcessor); + parseRunner.setProperty(ParseJREAPC.CLASSIFICATION_ATTRIBUTE_NAMESPACE, + "https://classification.example.mil/attr/level"); + parseRunner.enqueue(jreapMsg); + parseRunner.run(1); + + MockFlowFile parsedFF = parseRunner.getFlowFilesForRelationship(ParseJREAPC.REL_SUCCESS).get(0); + assertEquals("TOP SECRET", parsedFF.getAttribute("jreapc.classification")); + assertEquals("https://classification.example.mil/attr/level/value/top_secret", + parsedFF.getAttribute("tdf_attribute")); + assertArrayEquals(jreapMsg, parsedFF.toByteArray()); + } +} diff --git a/nifi-tdf-processors/src/test/java/io/opentdf/nifi/Utils.java b/nifi-tdf-processors/src/test/java/io/opentdf/nifi/Utils.java index b505c18..65ce79d 100644 --- a/nifi-tdf-processors/src/test/java/io/opentdf/nifi/Utils.java +++ b/nifi-tdf-processors/src/test/java/io/opentdf/nifi/Utils.java @@ -12,6 +12,7 @@ public class Utils { static void setupTDFControllerService(TestRunner runner) throws Exception { + runner.setValidateExpressionUsage(false); SimpleOpenTDFControllerService tdfControllerService = new SimpleOpenTDFControllerService(); Map controllerPropertyMap = new HashMap<>(); controllerPropertyMap.put(PLATFORM_ENDPOINT.getName(), "http://platform"); diff --git a/pom.xml b/pom.xml index 9f34e47..c9e9ab6 100644 --- a/pom.xml +++ b/pom.xml @@ -29,11 +29,11 @@ https://github.com/opentdf/nifi/tree/main/ - 1.23.1 + 2.7.0 5.10.0 - 17 - 17 - 17 + 21 + 21 + 21 .7 @@ -41,6 +41,8 @@ nifi-tdf-controller-services-api-nar nifi-tdf-processors nifi-tdf-nar + nifi-jreapc-processors + nifi-jreapc-nar @@ -78,13 +80,13 @@ org.mockito mockito-core - 5.2.0 + 5.14.0 test org.mockito mockito-junit-jupiter - 5.2.0 + 5.14.0 test @@ -105,7 +107,7 @@ org.apache.nifi nifi-nar-maven-plugin - 1.5.1 + 2.0.0 true @@ -256,6 +258,10 @@ 1 + + false
@@ -398,6 +404,7 @@ 1 ${argLine} + false