Send data to Domo¶
Domo is a cloud-based, self-service BI tool that helps you visualize data from a single dashboard.
You can use a variety of data visualization workflows to analyze data in Domo. Send query results from Amperity to a customer-managed Amazon S3 bucket, and then load that data to Domo as a DataSet. You may also send query results to Databricks, and then use the Databricks Connector to load data to Domo.
This topic describes the steps that are required to send data to Domo from Amperity.
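The workflow in this topic has Domo pull files by way of its Amazon S3 connector. As a separate illustration of what loading data to Domo as a DataSet looks like, the sketch below instead pushes an exported CSV to Domo through the pydomo SDK; the credentials, bucket, and file names are all hypothetical.

```python
# A hedged sketch, not the connector-based workflow this topic describes:
# pull an Amperity export from S3 and push it to Domo as a DataSet using
# the pydomo SDK. All names and credentials here are hypothetical.
import io

import boto3
import pandas as pd
from pydomo import Domo

# Read the exported CSV from the customer-managed bucket (hypothetical names).
s3 = boto3.client("s3")
obj = s3.get_object(Bucket="example-shared-bucket", Key="exports/query_results.csv")
df = pd.read_csv(io.BytesIO(obj["Body"].read()))

# Create a Domo DataSet from the DataFrame (hypothetical API credentials).
domo = Domo("example-client-id", "example-client-secret", api_host="api.domo.com")
dataset_id = domo.ds_create(df, "Amperity Query Results")
print("Created DataSet:", dataset_id)
```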
Get details¶
Amperity can be configured to send data to a customer-managed Amazon S3 bucket. Send data from Amperity to a customer-managed Amazon S3 bucket using cross-account roles, and then connect Domo to that Amazon S3 bucket.
Configure cross-account roles¶
Amperity prefers to pull data from and send data to customer-managed cloud storage.
Amperity requires using cross-account role assumption to manage access to Amazon S3 to ensure that customer-managed security policies control access to data.
This approach ensures that customers can:
Directly manage the IAM policies that control access to data
Directly manage the files that are available within the Amazon S3 bucket
Modify access without requiring involvement by Amperity; access may be revoked at any time by either Amazon AWS account, after which data sharing ends immediately
Directly troubleshoot incomplete or missing files
Note
After setting up cross-account role assumption, a list of files (by filename and file type), along with any sample files, must be made available to allow for feed creation. These files may be placed directly into the shared location after cross-account role assumption is configured.
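For example, a sample file can be placed into the shared location with the AWS SDK once cross-account role assumption is configured. A minimal sketch, assuming hypothetical bucket, path, and file names:

```python
# A minimal sketch of placing a sample file into the shared bucket so it
# can be used for feed creation; bucket and key names are hypothetical.
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    Filename="customer_records_sample.csv",  # hypothetical local sample file
    Bucket="example-shared-bucket",          # hypothetical shared bucket
    Key="uploads/customer_records_sample.csv",
)
```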
Can I use an Amazon AWS Access Point?
Yes, but with the following limitations:
The direction of access is one-way: Amperity accesses files that are located in a customer-managed Amazon S3 bucket
A credential-free role-to-role access pattern is used
Traffic is not restricted to VPC-only
To configure an S3 bucket for cross-account role assumption
The following steps describe how to configure Amperity to use cross-account role assumption to pull data from (or push data to) a customer-managed Amazon S3 bucket.
Important
These steps require configuration changes to customer-managed Amazon AWS accounts and must be done by users with administrative access.
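As a rough illustration of the customer-side setup, the following sketch creates an IAM role that Amperity could assume, scoped to a single bucket. Every identifier here is hypothetical; the actual Amperity principal ARN and external ID are supplied during configuration.

```python
# A minimal sketch of granting cross-account access. The account IDs, role
# name, external ID, and bucket name are all hypothetical placeholders.
import json

import boto3

iam = boto3.client("iam")

# Trust policy: allow a (hypothetical) Amperity AWS account to assume this role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::111111111111:root"},  # hypothetical Amperity account
        "Action": "sts:AssumeRole",
        "Condition": {"StringEquals": {"sts:ExternalId": "amperity-external-id"}},  # hypothetical
    }],
}

iam.create_role(
    RoleName="amperity-s3-access",  # hypothetical role name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# Permissions policy: limit the role to the shared bucket.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::example-shared-bucket",    # hypothetical bucket
            "arn:aws:s3:::example-shared-bucket/*",
        ],
    }],
}

iam.put_role_policy(
    RoleName="amperity-s3-access",
    PolicyName="amperity-s3-rw",
    PolicyDocument=json.dumps(bucket_policy),
)
```

Because access is controlled entirely by this role and its policies, either account can revoke the trust relationship at any time, at which point data sharing ends immediately.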
Add destination¶
Configure Amperity to send data directly to Domo.
To add a destination
Open the Destinations tab to configure a destination for Domo. Click the Add Destination button to open the Destination dialog box.
Important
Use the Amazon S3 destination that is built into Amperity to send data to a customer-managed Amazon S3 bucket, from which Domo is configured to pull data.
Enter a name for the destination and provide a description. For example: “Domo” and “This sends data to Domo”.
From the Plugin drop-down, start typing “s3” to filter the list, and then select Amazon S3.
Credentials allow Amperity to connect to Domo. The credential type is set automatically. You may use an existing credential or you may add a new one.
Select an existing credential from the Credential drop-down.
– or –
Select Create a new credential from the Credential drop-down. This opens the Credential dialog box. Enter a name for the credential, add a description, and then complete the settings that are required by the credential type.
When finished, click Save.
Each destination has settings that define how Amperity will deliver data to Domo. These settings are listed under the Settings section of the Destination dialog box.
Complete the Amazon S3 settings, including the name of the customer-managed Amazon S3 bucket to which Amperity sends data.
Business users are assigned to the Amp360 User and/or AmpIQ User policies. (Amp360 User allows access to queries and orchestrations; AmpIQ User allows access to segments and campaigns.) A business user cannot select a destination that is not visible to them.
Business users, including users assigned to the DataGrid Operator policy, may have restricted access to PII.
What is restricted access to PII?
Restricted PII access is enabled by the Restrict PII Access policy option, which prevents users who are assigned that option from viewing data that is marked as PII anywhere in Amperity and from sending that data to any downstream workflow.
You can make this destination visible to orchestrations and allow users with restricted access to PII to use this destination by enabling one (or both) of the corresponding options in the Destination dialog box.
Review all settings, and then click Save.
Important
You must configure a data template for this destination before you can send data to Domo.
Add data template¶
A data template defines how columns in Amperity data structures are sent to downstream workflows. A data template is part of the configuration for sending query and segment results from Amperity to an external location.
To add a data template
Workflow actions¶
A workflow will occasionally show an error that describes what prevented it from completing successfully. These errors first appear as alerts in the notifications pane. Each alert describes the error, and then links to the Workflows tab.
Open the Workflows tab to review a list of workflow actions, choose an action to resolve the workflow error, and then follow the steps that are shown.
Invalid bucket name¶
The name of the Amazon S3 bucket to which Amperity pushes data must be correctly specified in the configuration for the destination in the Destinations page.
To resolve this error, do the following.
Open the AWS management console and verify the name of the Amazon S3 bucket.
Open the Destinations page in Amperity, and then open the destination that is associated with this workflow.
Update the destination with the correct Amazon S3 bucket name.
Return to the workflow action, and then click Resolve to retry.
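Before retrying, you can also confirm from the AWS side that the bucket name is valid and reachable. A minimal sketch, assuming a hypothetical bucket name:

```python
# A minimal sketch for verifying a bucket name before updating the
# destination; the bucket name is hypothetical.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
try:
    s3.head_bucket(Bucket="example-shared-bucket")
    print("Bucket exists and is reachable.")
except ClientError as err:
    # 404 means the bucket name is wrong; 403 means it exists but access is denied.
    print("Check the bucket name:", err.response["Error"]["Code"])
```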
Invalid credentials¶
The credentials that are defined in Amperity are invalid.
To resolve this error, verify that the credentials required by this workflow are valid.
Open the Credentials page.
Review the details for the credentials used with this workflow. Update the credentials for Domo if required.
Return to the workflow action, and then click Resolve to retry this workflow.
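If the destination uses cross-account role assumption, you can also confirm from the AWS side that the role is still assumable before retrying. A minimal sketch, assuming a hypothetical role ARN and external ID:

```python
# A minimal sketch for confirming that the cross-account role can still be
# assumed; the role ARN and external ID are hypothetical.
import boto3

sts = boto3.client("sts")
resp = sts.assume_role(
    RoleArn="arn:aws:iam::222222222222:role/amperity-s3-access",  # hypothetical
    RoleSessionName="credential-check",
    ExternalId="amperity-external-id",                            # hypothetical
)
print("Role assumed; temporary key:", resp["Credentials"]["AccessKeyId"])
```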