Managing the Snowflake Connector for Google Analytics Raw Data

This topic describes typical tasks you might need to perform after installing and configuring the connector.

Changing the ingestion interval for the connector

The connector periodically checks for and downloads data from BigQuery. By default, this check runs every 8 hours, but the interval can be changed. To set a new interval for checking and downloading data, use the CONFIGURE_INGESTION_INTERVAL procedure defined in the PUBLIC schema:
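A minimal sketch follows; the interval literal '8h' is an assumption, so substitute a value that LIST_SUPPORTED_INGESTION_INTERVALS reports as supported in your installation:

```sql
-- Set a new ingestion interval. The '8h' literal is hypothetical;
-- use a value returned by LIST_SUPPORTED_INGESTION_INTERVALS.
CALL PUBLIC.CONFIGURE_INGESTION_INTERVAL('8h');
```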


To print the list of supported intervals, use the LIST_SUPPORTED_INGESTION_INTERVALS procedure defined in the PUBLIC schema:
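For example (a sketch assuming the procedure takes no arguments):

```sql
-- List the ingestion intervals this connector installation supports.
CALL PUBLIC.LIST_SUPPORTED_INGESTION_INTERVALS();
```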


Setting up alerts

To set up alerts, do the following:

  1. Sign in to Snowsight as a user with the ACCOUNTADMIN role.

  2. In the navigation menu, select Data Products » Apps.

  3. Search for the Snowflake Connector for Google Analytics Raw Data, then select the tile for the connector.

  4. In the page for the Snowflake Connector for Google Analytics Raw Data, go to the Settings section and then select Email alerts from the menu on the left.

    This displays a page for the email alerts configuration.

  5. In the Email Address field, provide a Snowflake verified email address.


Note: You must specify an email address that is associated with the Snowflake account.

  6. In the Email Frequency field, select how often you would like to receive alerts:

  • Immediately - you receive notifications according to the values set for table synchronization.

  • Once per day - you receive notifications once a day at 12 PM UTC.


Note: Alerts are sent only when an invalid action (such as an error) occurs.

  7. Select Save changes to start receiving email alerts.

Disabling alerts

To stop receiving alerts, select Stop receiving alerts in the email alerts configuration page.

Upgrading the connector

Upgrades to the connector are managed automatically by the provider of the application.

Scaling the Connector

Start your work with the connector using an X-Small warehouse, which in most cases provides sufficient performance. However, if you experience any connector slowdowns, try gradually increasing the warehouse size and evaluate whether performance improves at each step. Whether the connector benefits from scaling the warehouse depends on a few factors, such as the number of properties or the amount of data each of them has.

For details on how to resize a warehouse, see Resizing a warehouse in Working with warehouses.
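As a sketch, a single step up in size might look like the following; the warehouse name is hypothetical, so substitute the warehouse your connector actually uses:

```sql
-- Hypothetical warehouse name; replace GA_RAW_DATA_WH with the
-- warehouse assigned to the connector. Moves from X-Small to Small.
ALTER WAREHOUSE GA_RAW_DATA_WH SET WAREHOUSE_SIZE = 'SMALL';
```

After resizing, observe ingestion times for a few cycles before deciding whether a further increase is worthwhile.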

Automatically disabling inaccessible Google Analytics properties

The connector can automatically disable inaccessible Google Analytics properties. This prevents unnecessary costs caused by indefinitely attempting to ingest data that does not exist, and it alerts you that data is no longer being ingested. A property is considered inaccessible, and might be automatically disabled, if its data ingestions have failed for the last 7 days.

Proceeding during disaster recovery and failover

If you want to ensure that the connector can continue data ingestion during a deployment outage, you need to set up failover of the sink database to a replica account. For details, see Failing over databases across multiple accounts.
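A sketch of enabling failover for the sink database follows; the database and account names are hypothetical, and your account may instead use failover groups, so consult the failover documentation linked above:

```sql
-- Hypothetical names: replace GA_SINK_DB and myorg.replica_account
-- with your sink database and the target replica account.
ALTER DATABASE GA_SINK_DB ENABLE FAILOVER TO ACCOUNTS myorg.replica_account;
```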

Moreover, after an outage you need to manually install the Snowflake Connector for Google Analytics Raw Data on your replica account, because the connector itself cannot be replicated. After installation, the connector synchronizes itself with the replicated sink database.


Note: To prevent data corruption, you cannot have two connector instances (one on the primary account and one on a replica account) ingesting data into the sink database at the same time.

When a deployment outage occurs and your sink database fails over to a replica account, perform the following steps:

  1. Sign in to your secondary account, where the sink database is replicated.

  2. Install the Snowflake Connector for Google Analytics Raw Data on your secondary account. The connector will synchronize itself with the replicated sink database. The instance on your primary account goes into a read-only state after an outage, so data will not be corrupted at this point.

  3. If you want to go back to the primary account after the deployment is available again, you need to first drop both connector instances. This is necessary to ensure a consistent connector state.

  4. Replicate the data back from the secondary account to the primary one using the replication mechanism.

  5. Reinstall the connector on the primary account once the data in the sink table synchronizes with the sink table on your secondary account.
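Step 4 above can be sketched with database replication commands, assuming a sink database named GA_SINK_DB (a hypothetical name). On the original primary account, refresh the local replica from the current primary and then promote it back:

```sql
-- Run on the original primary account; GA_SINK_DB is hypothetical.
ALTER DATABASE GA_SINK_DB REFRESH;   -- pull the latest data from the current primary
ALTER DATABASE GA_SINK_DB PRIMARY;   -- promote this replica back to primary
```

Only after the promotion completes should you reinstall the connector on the primary account, per step 5.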