Send data to Snowflake on Azure¶
Snowflake is an analytic data warehouse that is fast, easy to use, and flexible. Snowflake uses a SQL database engine that is designed for the cloud. Snowflake can provide tables as a data source to Amperity.
Amperity can send data to customer-managed instances of Snowflake that are located in the same cloud platform as Amperity in the following ways:
To stages that are configured in the customer-managed instance
Sharing data directly between an Amperity-managed instance and a customer-managed instance
This topic describes the steps that are required to send databases, tables, and query results to Snowflake from Amperity.
Get details¶
Snowflake requires the following configuration details:

- The username and password. Tip: Most users are configured with an initial password that must be changed. Create the user, manually change the password, and then use the updated password to configure this destination.
- The Azure Blob Storage account name and shared access signature for the dedicated role for use by Amperity. Important: This role must be granted write permission to Snowflake objects, permission to access each table, and permission to write table output to Azure Blob Storage.
- The account name.
- The region in which Snowflake is located. Note: The Amperity instance of Snowflake should be located in the same region as the customer's instance of Snowflake.
- The stage name.
- A list of table names in the Snowflake database to be sent from Amperity.
- Your Snowflake account identifier for Snowflake secure data sharing.

You can find all of this information after you log into Snowflake.
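For example, the account identifier and region can be read directly from a Snowflake session using standard Snowflake context functions:

```sql
-- Return the account identifier and region for the current session.
SELECT CURRENT_ACCOUNT();
SELECT CURRENT_REGION();
```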
Snowflake secure data sharing¶
Snowflake secure data sharing is not enabled by default. If your organization wants to use secure data sharing, your instance of Snowflake must be in the same region as the Amperity instance and you must provide your Snowflake account identifier to Amperity.
Amperity supports the following regions:
| Cloud platform | Regions |
| --- | --- |
| Amazon AWS | aws-us-west-2, aws-us-east-1 |
| Microsoft Azure | az-us-east-2 |
Note
If your region is not listed, contact your Amperity representative to request that Amperity engineering add support for that region.
Configure objects¶
Amperity requires access to the customer’s instance of Snowflake. This requires write permissions to a warehouse object in Snowflake via a role and a user.
The following objects must be created in the customer’s instance of Snowflake:
- A role via CREATE ROLE. The role must be granted permission to the warehouse via GRANT USAGE.
- A stage via CREATE STAGE. Note: The external stage is used by the Amperity destination to stage data that is sent from Amperity in the Azure container. Snowflake picks data up from this location. The CREATE STAGE command defaults to the CSV file type. When sending data from Amperity to Snowflake you must configure an external stage that points to the URL for the Azure container that is included with Amperity (URL = 'azure://account.blob.core.windows.net/container[/path/]').
- A warehouse via CREATE WAREHOUSE.
- A user via CREATE USER. The user must be added to the role via GRANT ROLE.
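An external stage that points at the Azure container can be sketched as follows; the container URL and the SAS token shown here are placeholders for the values provided by Amperity:

```sql
-- Hypothetical external stage pointing at the Azure container that is
-- included with Amperity; the URL and SAS token are placeholders.
CREATE STAGE AMPERITY_DATABASE.CUSTOMER_360.AMPERITY_STAGE
  URL = 'azure://account.blob.core.windows.net/container/path/'
  CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>')
  FILE_FORMAT = (TYPE = CSV);
```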
To configure Snowflake objects
To configure Snowflake objects use a Data Definition Language (DDL) command similar to:
```sql
CREATE ROLE AMPERITY;

CREATE DATABASE AMPERITY_DATABASE;

CREATE SCHEMA AMPERITY_DATABASE.CUSTOMER_360;

CREATE STAGE AMPERITY_DATABASE.CUSTOMER_360.AMPERITY_STAGE;

CREATE WAREHOUSE AMPERITY_WAREHOUSE WITH
  WAREHOUSE_SIZE = 'XSMALL'
  WAREHOUSE_TYPE = 'STANDARD'
  AUTO_SUSPEND = 600
  AUTO_RESUME = TRUE
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 2
  SCALING_POLICY = 'ECONOMY';

GRANT USAGE ON STAGE AMPERITY_DATABASE.CUSTOMER_360.AMPERITY_STAGE TO ROLE AMPERITY;
GRANT USAGE ON WAREHOUSE AMPERITY_WAREHOUSE TO ROLE AMPERITY;

CREATE USER amperity
  PASSWORD = '********************'
  DEFAULT_WAREHOUSE = AMPERITY_WAREHOUSE
  DEFAULT_ROLE = AMPERITY;

GRANT ROLE AMPERITY TO USER amperity;
```
Note
You must use a SnapPass link to send the following information to any customer who will connect directly to the Snowflake data warehouse:

- The URL for the Snowflake data warehouse.
- The Snowflake username.
- The password for that username.
- The Snowflake data warehouse name.
Configure user access¶
Users who will connect to Snowflake to use data in external applications, such as Tableau, Microsoft Power BI, and Databricks, must be granted read-only access to the Snowflake data warehouse.
To configure user access to a Snowflake data warehouse

To configure user access use a Data Definition Language (DDL) command similar to:

```sql
CREATE ROLE USER_READ;

GRANT SELECT ON ALL TABLES IN SCHEMA AMPERITY_DATABASE.CUSTOMER_360 TO ROLE USER_READ;
GRANT SELECT ON FUTURE TABLES IN SCHEMA AMPERITY_DATABASE.CUSTOMER_360 TO ROLE USER_READ;
GRANT USAGE ON WAREHOUSE AMPERITY_WAREHOUSE TO ROLE USER_READ;
GRANT USAGE ON DATABASE AMPERITY_DATABASE TO ROLE USER_READ;
GRANT USAGE ON SCHEMA AMPERITY_DATABASE.CUSTOMER_360 TO ROLE USER_READ;

CREATE USER USERNAME
  PASSWORD = 'PASSWORD'
  DEFAULT_WAREHOUSE = AMPERITY_WAREHOUSE
  DEFAULT_ROLE = USER_READ
  FIRST_NAME = 'USER_FIRST_NAME'
  LAST_NAME = 'USER_LAST_NAME'
  EMAIL = 'USER_EMAIL_ADDRESS';

GRANT ROLE USER_READ TO USER USERNAME;
```
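A quick way to confirm the read-only grants is to switch to the new role and inspect it; the following sketch uses standard Snowflake commands, with the table name as a placeholder:

```sql
-- Hypothetical check: assume the USER_READ role, then verify grants
-- and confirm the role can read a table in the schema.
USE ROLE USER_READ;
USE WAREHOUSE AMPERITY_WAREHOUSE;
USE SCHEMA AMPERITY_DATABASE.CUSTOMER_360;

SHOW GRANTS TO ROLE USER_READ;
SELECT COUNT(*) FROM <table-name>;  -- placeholder table name
```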
Configure storage integration¶
Amperity can send database table data to Snowflake. This output should be written to Azure Blob Storage via a storage integration configured in Snowflake. This requires the Azure tenant ID and the full path to each allowed storage location.

For each database and schema to be output to Azure Blob Storage, specify a database and schema via USE, create a storage integration via CREATE STORAGE INTEGRATION, and then GRANT USAGE.
To configure Snowflake storage integration
To configure Snowflake storage integration use a Data Definition Language (DDL) command similar to:
```sql
USE DATABASE <database-name>;
USE SCHEMA <schema-name>;

CREATE STORAGE INTEGRATION <storage-integration-name>
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = AZURE
  ENABLED = TRUE
  AZURE_TENANT_ID = '<tenant-id>'
  STORAGE_ALLOWED_LOCATIONS = ('PATH', 'PATH', 'PATH');

GRANT USAGE ON INTEGRATION <storage-integration-name> TO ROLE AMPERITY;

DESC STORAGE INTEGRATION <storage-integration-name>;
```

where each PATH is similar to azure://<account>.blob.core.windows.net/<container>/<path>/.
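If a stage should write through this storage integration, it can be sketched as follows; the stage name, integration name, and container URL are placeholders:

```sql
-- Hypothetical stage that uses the storage integration; all names and
-- the container URL are placeholders.
CREATE STAGE <stage-name>
  STORAGE_INTEGRATION = <storage-integration-name>
  URL = 'azure://<account>.blob.core.windows.net/<container>/<path>/'
  FILE_FORMAT = (TYPE = CSV);

GRANT USAGE ON STAGE <stage-name> TO ROLE AMPERITY;
```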
Configure tables¶
For each table to be sent from Amperity, the Amperity role in Snowflake must be given permission. This requires permissions on both the database and the schema that contain the table. Usage rights to the database and schema do not grant access to the entire database or schema. Additional table-specific grants are required via a combination of GRANT USAGE, USE, and GRANT SELECT.
To configure Snowflake tables
To configure Snowflake tables use a Data Definition Language (DDL) command similar to:
```sql
GRANT USAGE ON DATABASE <database-name> TO ROLE AMPERITY;
USE DATABASE <database-name>;

GRANT USAGE ON SCHEMA <schema-name> TO ROLE AMPERITY;
USE SCHEMA <schema-name>;

GRANT SELECT ON TABLE <table-name> TO ROLE AMPERITY;
```
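To double-check what the Amperity role can access after granting table permissions, you can list its grants with a standard Snowflake command:

```sql
-- List every privilege currently granted to the AMPERITY role.
SHOW GRANTS TO ROLE AMPERITY;
```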
Define a database export¶
Database tables (or even entire databases) can be sent from Amperity to Snowflake as a database export. A database export is configured from the Customer 360 page, and then added to an orchestration from the Destinations page. A database export can be an ad hoc process or be scheduled as a regular drop to an external filedrop location or data warehouse.
To add a database export for Snowflake
1. From the Customer 360 page click Configure Exports. This opens the Database Exports page.
2. Click Create New Export. This opens the Add Export dialog box.
3. Add the name of the data export, and then click Add. This will add a draft data export to the Database Exports page.
4. Open the menu for the draft data export, and then select Edit.
5. From the Database drop-down, select a database.
6. From the Entities list, select one (or more) database tables to add to the data export.
7. Click Activate.
Add a Snowflake destination¶
Configure Amperity to send databases, tables, and query results directly to Snowflake.
To add a Snowflake destination
Open the Destinations tab to configure a destination for Snowflake. Click the Add Destination button to open the Destination dialog box.

Enter a name for the destination and provide a description. For example: "Snowflake" and "This sends databases, tables, and query results to Snowflake". From the Plugin drop-down, start typing "snow" to filter the list, and then select Snowflake.
Credentials allow Amperity to connect to Snowflake. The credential type is set automatically. You may use an existing credential or you may add a new one.

Select an existing credential from the Credential drop-down, or select Create a new credential from the Credential drop-down to open the Credential dialog box. Enter the name for the credential, and then add a description. The username and password should be the Amperity username and password created in Snowflake for the user security object. Add the Azure Blob Storage account name and shared access signature for the AZURE_ROLE with permission to access the Azure Blob Storage used for Snowflake stages. When finished, click Save.
Each destination has settings that define how Amperity will deliver data to Snowflake. Settings that are specific to Snowflake are listed under the Settings section of the Destination dialog box.
Business users are assigned to the Amp360 User and/or AmpIQ User policies. (Amp360 User allows access to queries and orchestrations; AmpIQ User allows access to segments and campaigns.) A business user cannot select a destination that is not visible to them. Business users, including users assigned to the DataGrid Operator policy, may have restricted access to PII. Restricted PII access is enabled by the Restrict PII access policy option, which prevents users assigned to that option from viewing data that is marked as PII anywhere in Amperity and from sending that data to any downstream workflow. You can make this destination visible to orchestrations, and allow users with restricted access to PII to use this destination, by enabling one (or both) of the destination visibility options.
Review all settings, and then click Save. Important: You must configure a data template for this destination before you can send data to Snowflake.
Add data template¶
A data template defines how columns in Amperity data structures are sent to downstream workflows. A data template is part of the configuration for sending query and segment results from Amperity to an external location.
To add a data template
From the Destinations tab, open the menu for a destination that is configured for Snowflake, and then select Add data template. This opens the Add Data Template dialog box.

Enter the name of the data template and a description. For example: "Snowflake" and "Send databases, tables, and query results to Snowflake".
Verify business user access to queries and orchestrations and access to segments and campaigns. A business user may also have restricted access to PII, which prevents them from viewing and sending customer profile data. If business user access was not configured as part of the destination, you may configure access from the data template.
Verify all configuration settings. Note: If the settings required by Snowflake were not configured as part of the destination, you must configure them as part of the data template before making this destination available to campaigns.
Under Template Settings, specify the Snowflake Location. This is a period-delimited list of the Snowflake database name, the Snowflake schema name, and the Snowflake table name: SNOWFLAKE_DATABASE_NAME.SNOWFLAKE_SCHEMA.TABLE_NAME. Note: Be sure that the table name is appended. For example: AMPERITY.CUSTOMER_360.TABLE_NAME.

Specify the option to use for managing the table in Snowflake: create, drop, or truncate.
Review all settings, and then click Save.