Role Permissions

Superusers can create custom Roles for Privitar Users. A custom Role consists of a custom combination of permissions and is assigned on a per-Team basis: the permissions are only valid for the specific Teams in which a given User holds that Role. For more information on how to create and edit Roles, refer to Managing Roles.

The following table describes the permissions that can be assigned to a Role. Each permission allows a specific action to be performed on a specific type of Privitar object.

| Object | Action | Description |
| --- | --- | --- |
| Schemas | Create | Create new Schemas describing the input data. |
| | Edit | Edit (that is, change) existing Schemas describing the input data. Note that editing a Schema could invalidate existing Policies and Jobs using it. |
| | Delete | Delete existing Schemas describing the input data. Note that deleting a Schema will invalidate existing Policies and Jobs using it. |
| Policies | Create | Create new Policies. That is, define which de-identification transformations (Rules and Generalization strategies) to apply to a given Schema. |
| | Edit | Edit (that is, change) existing de-identification Policies. Note that editing a Policy could affect and invalidate existing Jobs and Protected Data Domains using it. |
| | Delete | Delete existing de-identification Policies. Note that deleting a Policy could affect and invalidate existing Jobs and Protected Data Domains using it. |
| Rules | Create | Create new Rules. That is, define specific de-identification transformations (masking approaches) that can be used in Policies. Note that Rules, unlike other Privitar objects, are shared with and can be used by any Team, even if the User who created them does not belong to those Teams. |
| | Edit | Edit existing Rules created by the given User's Team. Note that editing a Rule could affect and invalidate existing Policies, Jobs and Protected Data Domains that use it, including in other Teams. |
| | Delete | Delete existing Rules created by the given User's Team. Note that deleting a Rule will affect and invalidate existing Policies, Jobs and Protected Data Domains that use it, including in other Teams. |
| Masking Jobs | Create | Create new masking (that is, de-identification) Jobs of any type (Batch/Hadoop, Data Flow and Privitar on Demand Jobs) on an already defined Environment, according to a given privacy Policy. Note that running the defined masking Job requires additional permissions. |
| | Edit | Edit existing masking Jobs of any type (Batch/Hadoop, Data Flow and Privitar on Demand Jobs). Note that editing a Job could invalidate it or change its behavior for new data that it processes. |
| | Delete | Delete existing masking Jobs of any type (Batch/Hadoop, Data Flow and Privitar on Demand Jobs). Note that deleting an existing Data Flow or Privitar on Demand Job will break the corresponding data pipeline. |
| | Cancel | Cancel (that is, terminate early) Batch/Hadoop masking Jobs that are still in progress. |
| | Run Batch | Run (that is, start the processing of) defined Batch masking Jobs, that is, de-identification Jobs on a Hadoop cluster. |
| | Run Data Flow | Permit the execution of Data Flow masking Jobs, that is, Jobs on a data streaming processor such as Apache NiFi, Apache Kafka/Confluent or StreamSets. This permission must be granted to the user credentials used in a Data Flow pipeline configuration, whether the pipeline uses basic HTTP authentication or Mutual TLS authentication. |
| | Run POD | Permit the execution of Privitar on Demand (POD) masking Jobs. POD masking Jobs can only be run by API Users using Mutual TLS authentication, so this permission is only applicable to API Users (see the authentication sketch after this table). |
| Unmasking Jobs | Create | Create new unmasking Jobs for previously de-identified data files on an Environment. Note that running the unmasking Job requires additional permissions. |
| | Edit | Edit existing unmasking Jobs. Note that editing a Job could invalidate it or change its behavior for new data that it processes. |
| | Delete | Delete existing unmasking Jobs. |
| | Cancel | Cancel (that is, terminate early) unmasking Jobs that are still in progress. |
| | Run Batch | Run (that is, start the processing of) defined Batch unmasking Jobs, that is, re-identification Jobs on a Hadoop cluster, on a given file. Note that there is a separate permission for unmasking single tokens rather than entire files (see Unmask Token under Protected Data below). |
| | Run Data Flow | Permit the execution of unmasking Jobs on a data streaming processor such as Apache NiFi, Apache Kafka/Confluent or StreamSets. |
| | Run POD | Permit the execution of Privitar on Demand (POD) unmasking Jobs. POD unmasking Jobs can only be run by API Users using Mutual TLS authentication, so this permission is only applicable to API Users. |
| Protected Data | Create | Create new Protected Data Domains (PDDs) in an existing Environment. |
| | Edit | Edit existing Protected Data Domains (PDDs). Note that during this process previously created PDD metadata might be overwritten. |
| | Delete | Delete closed Protected Data Domains (PDDs). Note that this deletes the PDD metadata and the Job history associated with it, but not the output data (except the tokens in the Token Vault). |
| | Close | Close (that is, lock) Protected Data Domains (PDDs). Closing a PDD makes it read-only and prevents any new data from being added. The record of tokens (that is, the Token Vault) produced during Job runs in this PDD is discarded. Closing a PDD also prevents unmasking. |
| | Unmask Token | Unmask (that is, re-identify to the original/raw value) a single token value of a de-identified field. This operation is only possible on a Hadoop Environment. Note that there are separate permissions for unmasking entire files (see Unmasking Jobs above). |
| | Run Unveiler | Permit the execution of SecureLink Unveiler operations over Protected Data Domains (PDDs). Unveiler operations can only be run by API Users using Mutual TLS authentication, so this permission is only applicable to API Users. |
| | Run Remasking | Permit the execution of Remasking operations over Protected Data Domains (PDDs). Remasking operations can only be run by API Users using Mutual TLS authentication, so this permission is only applicable to API Users. |
| | Remove Token Mapping | Permit the removal of a token mapping from a Token Vault in order to support a Right to be Removed request. This operation can only be run by API Users using Mutual TLS authentication, so this permission is only applicable to API Users. |
| Environments | Create | Create new Environments, including supplying Hadoop Cluster details and managing Token Vaults, secret keys, Privitar on Demand settings and SecureLink configurations. |
| | Edit | Edit the configuration of existing Environments, including Hadoop Cluster details, Token Vaults, secret keys and encryption, Privitar on Demand settings and SecureLink configurations. Note that changes could affect and invalidate other Privitar objects, such as Jobs and Protected Data Domains, that use the given Environment. |
| | Delete | Delete Environments. Note that deleting an Environment will affect and invalidate other Privitar objects, such as Jobs and Protected Data Domains, that use that Environment. |
| | Test | Test the Environment configuration. This test only applies to Hadoop Cluster Environments. |
| | Match Watermark | Investigate a file (in an Environment owned by your Team) and match a Watermark against it. This operation is only available through Batch Jobs on Hadoop Clusters. |
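
Several of the permissions above (Run Data Flow, Run POD, Run Unveiler, Run Remasking and Remove Token Mapping) are exercised by API Users authenticating with Mutual TLS, or, for Data Flow pipeline credentials, with basic HTTP authentication. The snippet below is a minimal, illustrative sketch of what those two authentication styles look like from a generic HTTPS client; the host name, URL path, credentials and certificate locations are placeholders and are not documented Privitar endpoints.

```python
# Minimal sketch of the two authentication styles referenced above, using the
# Python "requests" library. The host, port, paths and credentials below are
# placeholders, NOT documented Privitar endpoints; consult the Privitar API
# documentation for the actual URLs and payloads.
import requests

POLICY_MANAGER = "https://policy-manager.example.com:8443"  # placeholder host

# Mutual TLS: the client presents its own certificate and key, and verifies
# the server against the organisation's CA bundle. Permissions such as
# Run POD, Run Unveiler, Run Remasking and Remove Token Mapping apply to
# API Users identified this way.
mtls_response = requests.get(
    f"{POLICY_MANAGER}/api/example",       # placeholder path
    cert=("/etc/privitar/client.crt",      # client certificate (placeholder path)
          "/etc/privitar/client.key"),     # client private key (placeholder path)
    verify="/etc/privitar/ca-bundle.crt",  # CA bundle used to verify the server
)

# Basic HTTP authentication: Data Flow pipeline credentials can instead be
# supplied as a username/password pair.
basic_response = requests.get(
    f"{POLICY_MANAGER}/api/example",       # placeholder path
    auth=("dataflow-user", "example-password"),
    verify="/etc/privitar/ca-bundle.crt",
)

print(mtls_response.status_code, basic_response.status_code)
```

In either case, the authenticated API User or pipeline credential must hold a Role that grants the relevant permission for the Team that owns the object being operated on.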