Preview: File Explorer for S3
This feature is a preview.
Note: Contact Privacera Support to request enabling this feature.
You can browse AWS S3 buckets and their files and folders with the File Explorer for AWS S3, available under your application name tabs. You can use File Explorer to upload and retrieve data, and you can control access by granting or denying permissions to users, groups, or roles through the resource policies associated with the bucket.
Prerequisites
Be sure you have added Data access methods for AWS IAM Roles only.
Note
To use the AWS S3 signed URL, you must add the following property in the Application Properties > Custom Properties for the dataserver data resource. The property value can be any string.
dataserver.shared.secret=<provide_any_value>
Set up Cross-Origin Resource Sharing (CORS) on your S3 bucket. See the AWS documentation on how to set up CORS. Use the following permission JSON to set up CORS:
[
    {
        "AllowedHeaders": ["*"],
        "AllowedMethods": ["PUT", "GET"],
        "AllowedOrigins": ["https://privaceracloud.com"],
        "ExposeHeaders": []
    }
]
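The CORS rule above can also be applied programmatically. The sketch below builds the same configuration in Python and shows (commented out) how it could be applied with boto3's real `put_bucket_cors` call; the bucket name is a placeholder, and running the call requires AWS credentials with `s3:PutBucketCORS` permission.

```python
import json

# CORS rule from the prerequisite above: allow PUT and GET requests
# from the PrivaceraCloud origin, with any request header.
cors_configuration = {
    "CORSRules": [
        {
            "AllowedHeaders": ["*"],
            "AllowedMethods": ["PUT", "GET"],
            "AllowedOrigins": ["https://privaceracloud.com"],
            "ExposeHeaders": [],
        }
    ]
}

def cors_config_json(config=cors_configuration):
    """Serialize the CORS configuration for inspection or for the AWS CLI."""
    return json.dumps(config, indent=2)

# Applying it requires boto3 and valid AWS credentials; uncomment to run
# against a real bucket ("my-example-bucket" is a placeholder):
# import boto3
# boto3.client("s3").put_bucket_cors(
#     Bucket="my-example-bucket", CORSConfiguration=cors_configuration
# )

print(cors_config_json())
```

The same JSON can be saved to a file and applied with `aws s3api put-bucket-cors --bucket <bucket> --cors-configuration file://cors.json`.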
Connect S3 Application
To access AWS S3, connect the S3 application. For more information, see S3.
Modify Resource Policy
Navigate to Access Manager > Resource Policies, then click the privacera_s3 repository to display the list of policies defined in it.
Modify the default policy or create a new one. For more information on creating a policy, see Resource Policies.
In Bucket Name, add a bucket name to browse a specific bucket, or add * to include all buckets.
In Object Path, add an object path to browse the resources stored in that path, or add * to include all object paths.
In the Allow Condition section, do the following:
Under Permissions, click Add Permissions and select metadata read.
Under Select Group, select public.
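The steps above can be sketched as a policy document in the Apache Ranger policy JSON model that PrivaceraCloud builds on. The resource keys (`bucketname`, `objectpath`), the access-type name (`metadata read`), and the policy name are assumptions for illustration; verify them against your privacera_s3 service definition before using this shape anywhere.

```python
import json

# Sketch of the File Explorer browse policy in Ranger-style JSON.
# Resource keys and the access-type string are assumptions; check
# your privacera_s3 service definition for the exact names.
policy = {
    "service": "privacera_s3",
    "name": "file-explorer-browse",      # hypothetical policy name
    "resources": {
        "bucketname": {"values": ["*"]},  # or a specific bucket name
        "objectpath": {"values": ["*"]},  # or a specific object path
    },
    "policyItems": [
        {
            "accesses": [{"type": "metadata read", "isAllowed": True}],
            "groups": ["public"],
        }
    ],
}

print(json.dumps(policy, indent=2))
```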
File Explorer
To view the files and folders of an S3 bucket, do the following:
Navigate to Data Inventory > File Explorer.
Because the policy is enabled, all the S3 buckets under the application name tabs are displayed.
On the File Explorer page, you can do the following actions:
- Refresh: Refreshes the list of S3 buckets.
- Search: Searches for a particular bucket.
- Filter: Hides or shows columns.
- Create Folder: Creates a folder.
- Upload: Uploads a file.
- Delete: Deletes files or folders.
- Calculate: Calculates the folder size.
- Copy to Clipboard: Copies the object path.
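These File Explorer actions correspond to standard S3 API operations. The helpers below sketch that mapping against a boto3-style S3 client; the method names (`list_objects_v2`, `put_object`) match boto3's real client API, so in practice the `s3` argument could be a `boto3.client("s3")`. The helper names themselves are illustrative.

```python
def list_folder(s3, bucket, prefix=""):
    """Refresh: list the object keys under a folder (prefix)."""
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    return [obj["Key"] for obj in resp.get("Contents", [])]

def create_folder(s3, bucket, folder):
    """Create Folder: an S3 'folder' is a zero-byte key ending in '/'."""
    s3.put_object(Bucket=bucket, Key=folder.rstrip("/") + "/")

def folder_size(s3, bucket, prefix):
    """Calculate: sum the sizes of all objects under the prefix."""
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    return sum(obj["Size"] for obj in resp.get("Contents", []))
```

When these calls are routed through Privacera's dataserver, each one is checked against the resource policy and recorded in the audit log, just like the equivalent File Explorer action.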
Example
You can control access to an S3 bucket by managing the permissions in its policy. In this use case, you'll learn how to allow or restrict a user's ability to upload files to an S3 bucket, and how to monitor the activity in the audit log.
Create a policy with Read and Write permissions.
To upload a file, perform the following steps:
Navigate to Data Inventory > File Explorer.
Go to the S3 bucket where you want to upload the file.
Click Upload.
The Add File popup is displayed.
Choose the file to upload and click the Upload button.
The file is uploaded, a success message is displayed, and the file appears in the listing.
Navigate to Access Manager > Audits to view the audit log for the upload action.
Edit the policy and remove the Write permission.
To upload a file, perform the following steps:
Navigate to Data Inventory > File Explorer.
Click the tab of your application name, and select the S3 bucket where you want to upload the file.
Click Upload. The Add File popup is displayed.
Choose the file to upload and click the Upload button.
An "Access Denied" error message is displayed.
Navigate to Access Manager > Audits to view the audit log for the denied upload action.