Send data to Snowflake on Azure¶
Snowflake is an analytic data warehouse that is fast, easy to use, and flexible. Snowflake uses a SQL database engine that is designed for the cloud. Snowflake can provide tables as a data source to Amperity.
Amperity can send data to Snowflake in the following ways:
Stages configured in Snowflake
Sharing directly with Amperity
Snowflake must be located in the same cloud platform as Amperity.
This topic describes the steps that are required to send databases, tables, and query results to Snowflake from Amperity:
Get details¶
The username and password.
Tip
Most users are configured with an initial password that must be changed. Create the user, manually change the password, and then use the updated password to configure this destination.
The Azure Blob Storage account name and shared access signature for the dedicated role for use by Amperity.
Important
This role must be granted write permission to Snowflake objects, permission to access each table, and permission to write table output to Azure Blob Storage.
The account name.
The region in which Snowflake is located.
Note
The Amperity instance of Snowflake should be located in the same region as the customer’s instance of Snowflake.
The stage name.
A list of table names in the Snowflake database to be sent from Amperity.
Your Snowflake account identifier for Snowflake secure data sharing.
How do I find Snowflake account information?
You can find all of this information after you log into Snowflake.
Snowflake secure data sharing¶
Snowflake secure data sharing is not enabled by default. If your organization wants to use secure data sharing, your instance of Snowflake must be in the same region as the Amperity instance and you must provide your Snowflake account identifier to Amperity.
Amperity supports the following regions:
Cloud platform | Regions
---|---
Amazon AWS | aws-us-west-2, aws-us-east-1
Microsoft Azure | az-us-east-2
Note
If your region is not listed, ask your Amperity representative to request that Amperity engineering add support for that region.
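Before requesting secure data sharing, it can help to confirm that your region is one Amperity supports. A minimal sketch (the function and constant names are hypothetical, and the region list is taken from the table above):

```python
# Regions eligible for Snowflake secure data sharing with Amperity,
# per the supported-regions table above.
AMPERITY_SHARING_REGIONS = {
    "aws-us-west-2",
    "aws-us-east-1",
    "az-us-east-2",
}

def supports_secure_sharing(region: str) -> bool:
    """Return True when the Snowflake region identifier is supported."""
    return region.strip().lower() in AMPERITY_SHARING_REGIONS
```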
Configure objects¶
Amperity requires access to the customer’s instance of Snowflake. This requires write permissions to a warehouse object in Snowflake via a role and a user.
The following objects must be created in the customer’s instance of Snowflake:
A role via CREATE ROLE.
The role must be granted permission to the warehouse via GRANT USAGE.
A stage via CREATE STAGE.
Note
The external stage is used by the Amperity destination to stage data that is sent from Amperity in the Azure container. Snowflake picks data up from this location. The CREATE STAGE command defaults to CSV file types. When sending data from Amperity to Snowflake you must configure an external stage that points to the URL for the Azure container that is included with Amperity (URL = 'azure://account.blob.core.windows.net/container[/path/]').
A warehouse via CREATE WAREHOUSE.
A user via CREATE USER.
The user must be added to the role via GRANT ROLE.
To configure Snowflake objects
To configure Snowflake objects use a Data Definition Language (DDL) command similar to:
CREATE ROLE AMPERITY;
CREATE DATABASE AMPERITY_DATABASE;
CREATE SCHEMA AMPERITY_DATABASE.CUSTOMER_360;
CREATE STAGE AMPERITY_DATABASE.CUSTOMER_360.AMPERITY_STAGE;
CREATE WAREHOUSE AMPERITY_WAREHOUSE WITH
WAREHOUSE_SIZE = 'XSMALL'
WAREHOUSE_TYPE = 'STANDARD'
AUTO_SUSPEND = 600
AUTO_RESUME = TRUE
MIN_CLUSTER_COUNT = 1
MAX_CLUSTER_COUNT = 2
SCALING_POLICY = 'ECONOMY';
GRANT ALL PRIVILEGES ON WAREHOUSE AMPERITY_WAREHOUSE TO ROLE AMPERITY;
GRANT ALL PRIVILEGES ON SCHEMA AMPERITY_DATABASE.CUSTOMER_360 TO ROLE AMPERITY;
GRANT USAGE ON STAGE AMPERITY_DATABASE.CUSTOMER_360.AMPERITY_STAGE TO ROLE AMPERITY;
GRANT USAGE ON WAREHOUSE AMPERITY_WAREHOUSE TO ROLE AMPERITY;
CREATE USER amperity
PASSWORD = '********************'
DEFAULT_WAREHOUSE = AMPERITY_WAREHOUSE
DEFAULT_ROLE = AMPERITY;
GRANT ROLE AMPERITY TO USER amperity;
Note
You must send a customer who will connect directly to the Snowflake data warehouse the following information using a SnapPass link:
The URL for the Snowflake data warehouse.
The Snowflake username.
The password for that username.
Snowflake data warehouse name.
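Once the objects above exist, a client connects as the amperity user with the warehouse, database, and role from the DDL. A minimal sketch that only assembles the connection parameters (the helper name is hypothetical; actually connecting requires a client such as snowflake-connector-python and network access):

```python
# Sketch: build the connection parameters for the user created above.
# Values mirror the CREATE ROLE / CREATE WAREHOUSE / CREATE USER DDL.
def amperity_connection_params(account: str, password: str) -> dict:
    return {
        "user": "amperity",                 # created via CREATE USER
        "password": password,
        "account": account,                 # e.g. "ab12345"
        "warehouse": "AMPERITY_WAREHOUSE",  # created via CREATE WAREHOUSE
        "database": "AMPERITY_DATABASE",
        "schema": "CUSTOMER_360",
        "role": "AMPERITY",                 # created via CREATE ROLE
    }

# Connecting would then look like:
#   import snowflake.connector
#   conn = snowflake.connector.connect(**amperity_connection_params("ab12345", pw))
```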
Configure user access¶
Users that will connect to Snowflake to use data in external applications, such as Tableau, Microsoft Power BI, and Databricks, will need to be granted read-only access to the Snowflake data warehouse.
To configure user access to a Snowflake data warehouse
CREATE ROLE USER_READ;
GRANT SELECT ON ALL TABLES IN SCHEMA AMPERITY_DATABASE.CUSTOMER_360 TO ROLE USER_READ;
GRANT SELECT ON FUTURE TABLES IN SCHEMA AMPERITY_DATABASE.CUSTOMER_360 TO ROLE USER_READ;
GRANT USAGE ON WAREHOUSE AMPERITY_WAREHOUSE TO ROLE USER_READ;
GRANT USAGE ON DATABASE AMPERITY_DATABASE TO ROLE USER_READ;
GRANT USAGE ON SCHEMA AMPERITY_DATABASE.CUSTOMER_360 TO ROLE USER_READ;
CREATE USER USERNAME
PASSWORD = 'PASSWORD'
DEFAULT_WAREHOUSE = AMPERITY_WAREHOUSE
DEFAULT_ROLE = USER_READ
FIRST_NAME = 'USER_FIRST_NAME'
LAST_NAME = 'USER_LAST_NAME'
EMAIL = 'USER_EMAIL_ADDRESS';
GRANT ROLE USER_READ TO USER USERNAME;
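When onboarding several read-only users, the CREATE USER and GRANT ROLE statements above can be rendered per person. A hypothetical helper that only builds the SQL strings and does not execute them:

```python
# Sketch: render the read-only user DDL shown above for one person.
# Statement text mirrors the CREATE USER / GRANT ROLE commands in this section.
def render_read_only_user(username: str, password: str,
                          first: str, last: str, email: str) -> list[str]:
    return [
        f"CREATE USER {username}\n"
        f"  PASSWORD = '{password}'\n"
        f"  DEFAULT_WAREHOUSE = AMPERITY_WAREHOUSE\n"
        f"  DEFAULT_ROLE = USER_READ\n"
        f"  FIRST_NAME = '{first}'\n"
        f"  LAST_NAME = '{last}'\n"
        f"  EMAIL = '{email}';",
        f"GRANT ROLE USER_READ TO USER {username};",
    ]
```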
Configure storage integration¶
Amperity can send database table data to Snowflake. This output should be written to Azure Blob Storage via a storage integration configured in Snowflake. This requires the Azure tenant ID and the full path to each allowed storage location.
For each database and schema to be output to the Azure Blob Storage container, specify a database and schema via USE, create a storage integration via CREATE STORAGE INTEGRATION, and then GRANT USAGE.
To configure Snowflake storage integration
To configure Snowflake storage integration use a Data Definition Language (DDL) command similar to:
USE DATABASE <database-name>;
USE SCHEMA <schema-name>;
CREATE STORAGE INTEGRATION <storage-integration-name>
TYPE = EXTERNAL_STAGE
STORAGE_PROVIDER = AZURE
ENABLED = TRUE
AZURE_TENANT_ID = '<tenant-id>'
STORAGE_ALLOWED_LOCATIONS = ('PATH', 'PATH', 'PATH');
GRANT USAGE ON INTEGRATION <storage-integration-name> TO ROLE AMPERITY;
DESC STORAGE INTEGRATION <storage-integration-name>;
where each PATH is similar to azure://<account>.blob.core.windows.net/<container>/<path>/.
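A malformed entry in STORAGE_ALLOWED_LOCATIONS is a common source of failed loads, so it can be worth validating each path against the pattern above before running the DDL. A minimal sketch (the function name is hypothetical):

```python
import re

# Sketch: check that a STORAGE_ALLOWED_LOCATIONS entry follows the
# azure://<account>.blob.core.windows.net/<container>/<path>/ pattern.
_AZURE_LOCATION = re.compile(
    r"^azure://[a-z0-9]+\.blob\.core\.windows\.net/[^/]+(/.*)?$"
)

def is_valid_allowed_location(path: str) -> bool:
    return bool(_AZURE_LOCATION.match(path))
```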
Configure tables¶
For each table to be sent from Amperity, the Amperity role in Snowflake must be given permission. This requires permissions on both the database and the schema that contain the table. Usage rights to the database and schema do not grant access to the entire database or schema. Additional table-specific grants are required via a combination of GRANT USAGE , USE , and GRANT SELECT .
To configure Snowflake tables
To configure Snowflake tables use a Data Definition Language (DDL) command similar to:
GRANT USAGE ON DATABASE <database-name> TO ROLE AMPERITY;
USE DATABASE <database-name>
GRANT USAGE ON SCHEMA <schema-name> TO ROLE AMPERITY;
USE SCHEMA <schema-name>;
GRANT SELECT ON TABLE <table-name> TO ROLE AMPERITY;
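Because this grant sequence repeats for every table Amperity sends, it can be generated rather than typed by hand. A hypothetical helper that builds the statements above as strings, without executing them:

```python
# Sketch: emit the per-table grant sequence from this section for a list
# of tables. Statement text mirrors the DDL shown above.
def table_grants(database: str, schema: str, tables: list[str],
                 role: str = "AMPERITY") -> list[str]:
    stmts = [
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role};",
        f"USE DATABASE {database};",
        f"GRANT USAGE ON SCHEMA {schema} TO ROLE {role};",
        f"USE SCHEMA {schema};",
    ]
    stmts += [f"GRANT SELECT ON TABLE {t} TO ROLE {role};" for t in tables]
    return stmts
```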
Define a database export¶
Database tables (or even entire databases) can be sent from Amperity to Snowflake as a database export. A database export is configured from the Customer 360 tab, and then added to an orchestration from the Destinations tab. A database export can be an ad hoc process or be scheduled as a regular drop to an external filedrop location or data warehouse.
To add a database export for Snowflake
From the Customer 360 tab click Configure Exports. This opens the Database Exports page.
Click Create New Export. This opens the Add Export dialog box.
Add the name of the data export, and then click Add. This will add a draft data export to the Database Exports page.
Open the menu for the draft data export, and then select Edit.
From the Database drop-down, select a database.
From the Entities list, select one (or more) database tables to add to the data export.
Click Activate.
Add a Snowflake destination¶
Configure Amperity to send databases, tables, and query results directly to Snowflake.
To add a Snowflake destination
From the Destinations tab, click Add Destination. This opens the Add Destination dialog box.
Enter the name of the destination and a description. For example: “Snowflake” and “This sends data to Snowflake”.
From the Plugin drop-down, select Snowflake.
From the Credential Type drop-down, select a credential type: “azure-snowflake”.
From the Credential drop-down, select Create a new credential. This opens the Create New Credential dialog box.
Enter the username and password. This should be the Amperity username and password created in Snowflake for the user security object.
Add the Azure Blob Storage account name and shared access signature for the AZURE_ROLE with permission to access the Azure Blob Storage used for Snowflake stages.
Under Snowflake Settings, enter the Snowflake account name, the region identifier, Snowflake stage name, and Snowflake data warehouse name.
The Account Name is contained within the URL for the Snowflake instance and is a character code located before snowflakecomputing.com. For example: ab12345.
The Region ID is the region ID in Azure. For example: west-us-2.
The Stage is the Snowflake stage name, which is created by the CREATE STAGE command. For example: AMPERITY_ABCDEF.CUSTOMER_360.AMP_PROD_STAGE.
The Warehouse is the name of the Snowflake data warehouse. This is created by the CREATE WAREHOUSE command. For example: AMPERITY_WH.
Select Allow customers to use this data template.
Select Allow orchestrations from users with limited PII access.
Click Save.
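The account name described above can be extracted from a Snowflake URL programmatically. A minimal sketch, assuming a standard snowflakecomputing.com hostname (the function name is hypothetical):

```python
from urllib.parse import urlparse

# Sketch: pull the account name out of a Snowflake URL such as
# "https://ab12345.snowflakecomputing.com".
def account_name(snowflake_url: str) -> str:
    host = urlparse(snowflake_url).hostname or ""
    suffix = ".snowflakecomputing.com"
    if not host.endswith(suffix):
        raise ValueError(f"not a snowflakecomputing.com URL: {snowflake_url}")
    return host[: -len(suffix)]
```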
Add data template¶
A data template defines how columns in Amperity data structures are sent to downstream workflows. A data template is part of the configuration for sending query and segment results from Amperity to an external location.
To add a data template