Send data to analytics and BI tools

Amperity supports making data available to all of the leading analytics and business intelligence (BI) tools.

Send your data from Amperity to any of these tools, and then use that data to build dashboards and anything else your favorite analytics and BI tools enable.

This topic describes how to get data from Amperity to your favorite analytics and BI tools, and provides links to the documentation for each tool.

How analytics workflows work

Workflows that make data available to analytics and BI tools involve two broad steps.

Send data from Amperity.

Use any of the following options:

  1. Send data from Amperity to cloud storage – one of Amazon S3, Azure Blob Storage, or Google Cloud Storage.

  2. Send data to Snowflake.

  3. Send data to a cloud database – Amazon Redshift, Azure Synapse Analytics, or Google BigQuery.

  4. Send to a location from which you can use Open Database Connectivity (ODBC) or Java Database Connectivity (JDBC) to connect to the data.

The option that you choose for this step depends on which analytics and BI tools you use and on what type of cloud infrastructure is available to your organization. Many analytics and BI tools support any of these options. Cloud-based workflows are the most common.
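
As an illustration of option 4 above, the sketch below uses the pyodbc package to connect over ODBC and confirm that a table is populated before a BI tool is pointed at it. The DSN, credentials, and table name are hypothetical placeholders for your own environment.

    # Minimal ODBC validation sketch (Python + pyodbc).
    # Assumes an ODBC DSN named "amperity_dw" is already configured on this
    # machine and that Amperity loads a table named "customer_360"; both
    # names are placeholders.
    import pyodbc

    conn = pyodbc.connect("DSN=amperity_dw;UID=analyst;PWD=example-password")
    cursor = conn.cursor()

    # Confirm the table is populated before connecting a BI tool to it.
    cursor.execute("SELECT COUNT(*) FROM customer_360")
    row_count = cursor.fetchone()[0]
    print(f"customer_360 contains {row_count} rows")

    conn.close()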

Load data to your favorite analytics or BI tool.

Load the data into each analytics or BI tool using the steps that are described in that tool's documentation.

Analytics and BI tools

The following sections list some popular analytics and business intelligence tools in alphabetical order and describe the options that are available for each.

Amazon QuickSight

Amazon QuickSight is a cloud-based, self-service BI tool for creating and publishing interactive dashboards for retail, ecommerce, manufacturing, and more.

  1. Send CSV or TSV files from Amperity to Amazon S3.

  2. Configure Amazon QuickSight to create a dataset using Amazon S3 files.
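
Creating an S3-based dataset in Amazon QuickSight requires a manifest file that points at the exported files. The sketch below builds and uploads a minimal manifest with boto3; the bucket name and prefix are hypothetical, and the manifest schema should be confirmed against the current Amazon QuickSight documentation.

    # Sketch: build and upload the manifest that Amazon QuickSight uses to
    # create a dataset from files in Amazon S3. Bucket, prefix, and key
    # names are placeholders.
    import json
    import boto3

    bucket = "example-amperity-exports"  # hypothetical bucket name
    manifest = {
        "fileLocations": [
            {"URIPrefixes": [f"s3://{bucket}/quicksight/daily/"]}
        ],
        "globalUploadSettings": {
            "format": "CSV",
            "delimiter": ",",
            "containsHeader": "true",
        },
    }

    s3 = boto3.client("s3")
    s3.put_object(
        Bucket=bucket,
        Key="quicksight/manifest.json",
        Body=json.dumps(manifest).encode("utf-8"),
    )
    # In QuickSight, create a new dataset from S3 and point it at
    # s3://example-amperity-exports/quicksight/manifest.json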

Domo

Domo is a cloud-based, self-service BI tool that helps you visualize data from a single dashboard.

You can use a variety of data visualization workflows to analyze data in Domo. Send query results from Amperity to a customer-managed Amazon S3 bucket, and then load that data to Domo as a DataSet.
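
Before pointing Domo at the bucket, it can help to spot-check the exported query results. The sketch below reads the file with boto3 and pandas; the bucket and key names are hypothetical.

    # Sketch: spot-check the query results that Amperity sent to the
    # customer-managed S3 bucket before loading them into Domo as a DataSet.
    # Bucket and key names are placeholders.
    import io
    import boto3
    import pandas as pd

    s3 = boto3.client("s3")
    obj = s3.get_object(
        Bucket="example-amperity-exports",
        Key="domo/query_results.csv",
    )
    df = pd.read_csv(io.BytesIO(obj["Body"].read()))

    print(df.shape)   # row/column counts
    print(df.head())  # preview the first few rows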

Google BigQuery

Google BigQuery is a fully managed, serverless data warehouse that provides scalable, cost-effective analysis over petabytes of data using ANSI SQL.

  1. Send CSV or Parquet files from Amperity to Amazon S3 or Google Cloud Storage.

  2. Load Parquet or CSV files from cloud storage, or transfer Parquet and CSV files from Amazon S3.

  3. Enable downstream workflows in analytics and BI tools like Microsoft Power BI or Tableau, or enable real-time downstream workflows with Google Pub/Sub.
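
As an illustration of step 2, the sketch below uses the google-cloud-bigquery client library to load a Parquet file from Google Cloud Storage into a BigQuery table; the project, dataset, table, and bucket names are hypothetical.

    # Sketch: load a Parquet file that Amperity wrote to Google Cloud Storage
    # into a BigQuery table. Names are placeholders for your own environment.
    from google.cloud import bigquery

    client = bigquery.Client()

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    load_job = client.load_table_from_uri(
        "gs://example-amperity-exports/bigquery/customer_360.parquet",
        "example-project.amperity.customer_360",
        job_config=job_config,
    )
    load_job.result()  # wait for the load to finish

    table = client.get_table("example-project.amperity.customer_360")
    print(f"Loaded {table.num_rows} rows")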

Looker

Looker is an enterprise platform for business intelligence, data applications, and embedded analytics.

You can connect Looker to cloud databases, including Amazon Redshift, Azure Synapse Analytics, Google BigQuery, and Snowflake.

Microsoft Power BI

Microsoft Power BI is a collection of software services, applications, and connectors that work together to turn unrelated sources of data into coherent, visually immersive, and interactive insights.

Data is not sent from Amperity directly to Microsoft Power BI. Microsoft Power BI must connect to a location that supports queries against that data; it cannot connect directly to a static file. Instead, send data from Amperity indirectly by configuring a destination to:

  1. Send a CSV file to an Azure container, after which it is picked up by Azure Synapse Analytics.

  2. Send a CSV file to Google Cloud Storage, after which it is transferred to Google BigQuery.

  3. Send data to any supported connector.

Microsoft Power BI may be configured to connect directly to Snowflake, Google BigQuery, or Azure Synapse Analytics. The destination workflow in Amperity may be configured to send data on a regular basis to ensure that the data available to the Microsoft Power BI user is up to date.
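
If you take the Azure Synapse Analytics route, you can quickly confirm that the exported files landed in the Azure container using the azure-storage-blob library, as in the sketch below; the account, container, and prefix are hypothetical.

    # Sketch: confirm that the CSV files Amperity sent to the Azure container
    # have arrived before Azure Synapse Analytics (and Microsoft Power BI)
    # picks them up. The connection string and container name are placeholders.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(
        "DefaultEndpointsProtocol=https;AccountName=exampleaccount;"
        "AccountKey=<placeholder>;EndpointSuffix=core.windows.net"
    )
    container = service.get_container_client("amperity-exports")

    for blob in container.list_blobs(name_starts_with="powerbi/"):
        print(blob.name, blob.size)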

Pyramid Analytics

Pyramid Analytics supports using DataFlow sources to pull data from Amazon S3 and Azure Blob Storage.

Sisense

Review the Sisense connectors list and choose the option that works best for you. A variety of options are supported, including Amazon Redshift, Amazon S3, Azure Blob Storage, Google BigQuery, Google Cloud Storage, and Snowflake.

Tableau

Tableau is a visual analytics platform that enables people and organizations to make the most of their data. Tableau connects to a data source, and then queries that data directly.

Data is not sent from Amperity directly to Tableau. Tableau must connect to a location that supports queries against that data; it cannot connect directly to a static file. Instead, send data from Amperity indirectly by configuring a destination to:

  1. Send a CSV file to an Amazon S3 bucket, after which it is picked up by Amazon Redshift.

  2. Send a CSV file to an Azure container, after which it is picked up by Azure Synapse Analytics.

  3. Send a CSV file to Google Cloud Storage, after which data is transferred to Google BigQuery.

  4. Send Apache Parquet, Apache Avro, CSV, or JSON files to Snowflake, after which you connect to that data from Tableau.

Tableau may be configured to connect directly to Snowflake, Amazon Redshift, or Azure Synapse Analytics. The destination workflow in Amperity may be configured to send data on a regular basis to ensure that the data available to the Tableau user is up to date.
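
For the Snowflake path, the sketch below uses the snowflake-connector-python package to confirm that the table Amperity populated is queryable before connecting Tableau's Snowflake connector to it; the account, credentials, and table name are hypothetical.

    # Sketch: confirm that the data Amperity loaded into Snowflake is
    # queryable before connecting Tableau to it. Account, credentials,
    # database, and table names are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="example_account",
        user="analyst",
        password="example-password",
        warehouse="ANALYTICS_WH",
        database="AMPERITY",
        schema="PUBLIC",
    )

    cur = conn.cursor()
    cur.execute("SELECT COUNT(*) FROM CUSTOMER_360")
    print("CUSTOMER_360 rows:", cur.fetchone()[0])

    cur.close()
    conn.close()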

Tellius

Tellius is a platform for analytics and BI that unifies visual data intelligence, actionable insights, machine learning capabilities, and custom predictive models.

You can send data to cloud storage and then move that data into a cloud database using any connector that is supported by Tellius.

ThoughtSpot

Use DataFlow connections to make data available to ThoughtSpot. A variety of options are supported, including Amazon Redshift, Amazon S3, Azure Blob Storage, Google BigQuery, Google Cloud Storage, and Snowflake.

Yellowfin

Download JDBC drivers for Amazon Redshift, Amazon S3, Google BigQuery, Google Cloud Storage, and Snowflake. Send data to any of these locations, connect the appropriate JDBC driver to that source, and then open the data in Yellowfin.

Zoho Analytics

You can import data from feeds or from an Amazon S3 bucket.