Configure AWS RDS PostgreSQL instance for access audits
You can configure your AWS account to allow Privacera to access the audit logs of your RDS PostgreSQL instance through Amazon CloudWatch Logs. To enable this functionality, make the following changes in your account:
- Update the AWS RDS parameter group for the database
- Create an AWS SQS queue
- Specify an AWS Lambda function
- Create an IAM role for an EC2 instance
Update the AWS RDS parameter group for the database
To expose access audit logs, you must update the configuration of the data source.
Procedure
1. To create a role for audits, run the following SQL statement as a user with administrative credentials on your data source:

    CREATE ROLE rds_pgaudit;

2. Create a new parameter group for your database and specify the following values:

    - Parameter group family: Select the family that matches your database, from either the aurora-postgresql or postgres families.
    - Type: Select DB Parameter Group.
    - Group name: Specify a group name for the parameter group.
    - Description: Specify a description for the parameter group.

3. Edit the parameter group that you created in the previous step and set the following values:

    - pgaudit.log: Specify all, overwriting any existing value.
    - shared_preload_libraries: Specify pg_stat_statements,pgaudit.
    - pgaudit.role: Specify rds_pgaudit.

4. Associate the parameter group that you created with your database. Modify the configuration for the database instance and make the following changes:

    - DB parameter group: Specify the parameter group you created in this procedure.
    - PostgreSQL log: Ensure this option is set to enable logging to Amazon CloudWatch Logs.

5. When prompted, choose the option to immediately apply the changes you made in the previous step.

6. Restart the database instance.
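If you prefer to script these console steps, the following is a minimal sketch using boto3. The instance identifier, parameter group name, region, and family below are placeholders for illustration, not values required by Privacera; Aurora clusters may additionally use a cluster parameter group, which this sketch does not cover.

    import boto3

    # Placeholders: substitute your own instance identifier, group name, region, and family.
    DB_INSTANCE_ID = "database-1"
    PARAM_GROUP = "privacera-pgaudit-params"
    FAMILY = "postgres13"  # or an aurora-postgresql family

    rds = boto3.client("rds", region_name="us-east-1")

    # Create the parameter group.
    rds.create_db_parameter_group(
        DBParameterGroupName=PARAM_GROUP,
        DBParameterGroupFamily=FAMILY,
        Description="pgaudit settings for Privacera access audits",
    )

    # Set the pgaudit-related parameters. shared_preload_libraries is a static
    # parameter, so the change only takes effect after a reboot.
    rds.modify_db_parameter_group(
        DBParameterGroupName=PARAM_GROUP,
        Parameters=[
            {"ParameterName": "pgaudit.log", "ParameterValue": "all", "ApplyMethod": "pending-reboot"},
            {"ParameterName": "shared_preload_libraries", "ParameterValue": "pg_stat_statements,pgaudit", "ApplyMethod": "pending-reboot"},
            {"ParameterName": "pgaudit.role", "ParameterValue": "rds_pgaudit", "ApplyMethod": "pending-reboot"},
        ],
    )

    # Associate the group with the instance, enable PostgreSQL log export to
    # CloudWatch Logs, apply immediately, and restart so the changes take effect.
    rds.modify_db_instance(
        DBInstanceIdentifier=DB_INSTANCE_ID,
        DBParameterGroupName=PARAM_GROUP,
        CloudwatchLogsExportConfiguration={"EnableLogTypes": ["postgresql"]},
        ApplyImmediately=True,
    )
    rds.reboot_db_instance(DBInstanceIdentifier=DB_INSTANCE_ID)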
Verification
To verify that your database instance logs are available, complete the following steps:
1. From the Amazon RDS console, view the logs for your database instance.

2. From the CloudWatch console, complete the following steps:

    - Find the /aws/rds/cluster/* log group that corresponds to your database instance.
    - Click the log group name to confirm that a log stream exists for the database instance, and then click a log stream name to confirm that log messages are present.
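The same check can be made programmatically. Below is a minimal boto3 sketch; the log group path and region are placeholders for a cluster named database-1 and should be replaced with your own values.

    import boto3

    logs = boto3.client("logs", region_name="us-east-1")

    # Placeholder log group; substitute the path for your own instance.
    LOG_GROUP = "/aws/rds/cluster/database-1/postgresql"

    # Confirm that a log stream exists and print its most recent events.
    streams = logs.describe_log_streams(
        logGroupName=LOG_GROUP, orderBy="LastEventTime", descending=True, limit=1
    )["logStreams"]
    if not streams:
        raise SystemExit("No log streams found; check the parameter group and log export settings")

    events = logs.get_log_events(
        logGroupName=LOG_GROUP, logStreamName=streams[0]["logStreamName"], limit=5
    )["events"]
    for event in events:
        print(event["message"])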
Create an AWS SQS queue
To create the SQS queue that the AWS Lambda function (created later in this guide) will use, complete the following steps:

1. From the AWS console, create a new Amazon SQS queue with the default settings. Use the following format when specifying a value for the Name field:

    privacera-postgres-<RDS_CLUSTER_NAME>-audits

    where:

    - RDS_CLUSTER_NAME: The name of your RDS cluster.

2. After the queue is created, save the URL of the queue for later use.
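This step can also be scripted. A minimal boto3 sketch follows; the cluster name cluster1 and the region are only examples.

    import boto3

    sqs = boto3.client("sqs", region_name="us-east-1")

    # Example name following the privacera-postgres-<RDS_CLUSTER_NAME>-audits format.
    queue_name = "privacera-postgres-cluster1-audits"

    response = sqs.create_queue(QueueName=queue_name)
    # Save this URL; it is needed for the Lambda environment variables later.
    print("Queue URL:", response["QueueUrl"])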
Specify an AWS Lambda function
To create an AWS Lambda function to interact with the SQS queue, complete the following steps. In addition to creating the function, you must create a new IAM policy and associate a new IAM role with the function. You need to know your AWS account ID and AWS region to complete this procedure.
1. From the IAM console, create a new IAM policy and input the following JSON:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": "logs:CreateLogGroup",
                "Resource": "arn:aws:logs:<REGION>:<ACCOUNT_ID>:*"
            },
            {
                "Effect": "Allow",
                "Action": [
                    "logs:CreateLogStream",
                    "logs:PutLogEvents"
                ],
                "Resource": [
                    "arn:aws:logs:<REGION>:<ACCOUNT_ID>:log-group:/aws/lambda/<LAMBDA_FUNCTION_NAME>:*"
                ]
            },
            {
                "Effect": "Allow",
                "Action": "sqs:SendMessage",
                "Resource": "arn:aws:sqs:<REGION>:<ACCOUNT_ID>:<SQS_QUEUE_NAME>"
            }
        ]
    }

    where:

    - REGION: Specify your AWS region.
    - ACCOUNT_ID: Specify your AWS account ID.
    - LAMBDA_FUNCTION_NAME: Specify the name of the AWS Lambda function, which you will create later. For example: privacera-postgres-cluster1-audits
    - SQS_QUEUE_NAME: Specify the name of the AWS SQS queue.

2. Specify a name for the IAM policy, such as privacera-postgres-audits-lambda-execution-policy, and then create the policy.

3. From the IAM console, create a new IAM role and choose Lambda as the Use case.

4. Search for the IAM policy that you just created, such as privacera-postgres-audits-lambda-execution-policy, and select it.

5. Specify a Role name for the IAM role, such as privacera-postgres-audits-lambda-execution-role, and then create the role.

6. From the AWS Lambda console, create a new function and specify the following fields:

    - Function name: Specify a name for the function, such as privacera-postgres-cluster1-audits.
    - Runtime: Select Node.js 12.x from the list.
    - Permissions: Select Use an existing role and choose the role created earlier in this procedure, such as privacera-postgres-audits-lambda-execution-role.

7. Add a trigger to the function you created in the previous step, select CloudWatch Logs from the list, and then specify the following values:

    - Log group: Select the log group path for your Amazon RDS database instance, such as /aws/rds/cluster/database-1/postgresql.
    - Filter name: Specify auditTrigger.
8. In the Lambda source code editor, provide the following JavaScript code in the index.js file, which is open by default in the editor:

    var zlib = require('zlib');

    // CloudWatch Logs encoding
    var encoding = process.env.ENCODING || 'utf-8'; // default is utf-8
    var awsRegion = process.env.REGION || 'us-east-1';
    var sqsQueueURL = process.env.SQS_QUEUE_URL;
    var ignoreDatabase = process.env.IGNORE_DATABASE;
    var ignoreUsers = process.env.IGNORE_USERS;
    var ignoreDatabaseArray = ignoreDatabase.split(',');
    var ignoreUsersArray = ignoreUsers.split(',');

    // Import the AWS SDK
    const AWS = require('aws-sdk');
    // Configure the region
    AWS.config.update({region: awsRegion});

    exports.handler = function (event, context, callback) {
        var zippedInput = Buffer.from(event.awslogs.data, 'base64');
        zlib.gunzip(zippedInput, function (e, buffer) {
            if (e) {
                callback(e);
                return;
            }
            var awslogsData = JSON.parse(buffer.toString(encoding));

            // Create an SQS service object
            const sqs = new AWS.SQS({apiVersion: '2012-11-05'});

            console.log(awslogsData);
            if (awslogsData.messageType === 'DATA_MESSAGE') {
                // Process each log event before posting
                awslogsData.logEvents.forEach(function (log) {
                    console.log(log.message);

                    // Check whether the message belongs to an ignored database or user
                    var sendToSQS = true;
                    if (sendToSQS) {
                        for (var i = 0; i < ignoreDatabaseArray.length; i++) {
                            if (log.message.toLowerCase().indexOf("@" + ignoreDatabaseArray[i]) !== -1) {
                                sendToSQS = false;
                                break;
                            }
                        }
                    }
                    if (sendToSQS) {
                        for (var i = 0; i < ignoreUsersArray.length; i++) {
                            if (log.message.toLowerCase().indexOf(ignoreUsersArray[i] + "@") !== -1) {
                                sendToSQS = false;
                                break;
                            }
                        }
                    }
                    if (sendToSQS) {
                        let sqsOrderData = {
                            MessageBody: JSON.stringify(log),
                            MessageDeduplicationId: log.id,
                            MessageGroupId: "Audits",
                            QueueUrl: sqsQueueURL
                        };
                        // Send the log event to the SQS queue
                        let sendSqsMessage = sqs.sendMessage(sqsOrderData).promise();
                        sendSqsMessage.then((data) => {
                            console.log("Sent to SQS");
                        }).catch((err) => {
                            console.log("Error in Sending to SQS = " + err);
                        });
                    }
                });
            }
        });
    };

9. For the Lambda function, edit the environment variables and create the following environment variables:

    - REGION: Specify your AWS region.
    - SQS_QUEUE_URL: Specify your AWS SQS queue URL.
    - IGNORE_DATABASE: Specify privacera_db.
    - IGNORE_USERS: Specify your database administrative user, such as privacera.
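The environment variables can also be set outside the console. Below is a minimal boto3 sketch; the function name, region, account ID, and queue URL are placeholder example values.

    import boto3

    lambda_client = boto3.client("lambda", region_name="us-east-1")

    # Example values; substitute your own function name, region, and queue URL.
    lambda_client.update_function_configuration(
        FunctionName="privacera-postgres-cluster1-audits",
        Environment={
            "Variables": {
                "REGION": "us-east-1",
                "SQS_QUEUE_URL": "https://sqs.us-east-1.amazonaws.com/111122223333/privacera-postgres-cluster1-audits",
                "IGNORE_DATABASE": "privacera_db",
                "IGNORE_USERS": "privacera",
            }
        },
    )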
Create an IAM role for an EC2 instance
To create an IAM role for the AWS EC2 instance where you installed Privacera so that Privacera can read the AWS SQS queue, complete the following steps:
1. From the IAM console, create a new IAM policy and input the following JSON:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "sqs:DeleteMessage",
                    "sqs:GetQueueUrl",
                    "sqs:ListDeadLetterSourceQueues",
                    "sqs:ReceiveMessage",
                    "sqs:GetQueueAttributes"
                ],
                "Resource": "<SQS_QUEUE_ARN>"
            },
            {
                "Effect": "Allow",
                "Action": "sqs:ListQueues",
                "Resource": "*"
            }
        ]
    }

    where:

    - SQS_QUEUE_ARN: The ARN of the AWS SQS queue you created earlier.

2. Specify a name for the IAM policy, such as postgres-audits-sqs-read-policy, and create the policy.

3. Attach the IAM policy to the IAM role of the AWS EC2 instance where you installed Privacera.
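To confirm that the instance role grants read access to the queue, you can run a quick check from the Privacera EC2 instance. A minimal boto3 sketch follows; the queue URL and region are placeholders.

    import boto3

    # Run this on the Privacera EC2 instance; boto3 picks up the instance role automatically.
    sqs = boto3.client("sqs", region_name="us-east-1")

    # Placeholder queue URL; substitute the URL saved when you created the queue.
    queue_url = "https://sqs.us-east-1.amazonaws.com/111122223333/privacera-postgres-cluster1-audits"

    response = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=5)
    messages = response.get("Messages", [])
    print(f"Received {len(messages)} message(s); the instance role can read the queue")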