Accessing your subscriptions from your data warehouse
You can access your subscriptions directly from your data warehouses connected to CARTO. This is currently supported for BigQuery, Snowflake, AWS Redshift, Databricks and PostgreSQL.
Access in BigQuery
This option is only available after you have created a connection to BigQuery from the Connections section.
Go to your subscription’s detail page in the Data Observatory section of the Data Explorer, then click on the Access in button and select the BigQuery option.
The following information will be displayed:
Location of the data table in BigQuery, in the format project.dataset.table.
Location of the geography table in BigQuery, in the format project.dataset.table.
Example query to join the data and geography tables.
Remember that your subscription is composed of a data table and a geography table that you need to join in order to work with the data (read more about it here). Please use the example query provided to get started.
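In general, the query joins the two tables on their shared identifier column. Below is a minimal sketch of that pattern, using placeholder table locations and assuming a shared geoid column; the actual locations and join column are shown in your subscription's access details.

```sql
-- Hypothetical table locations; replace them with the project.dataset.table
-- values shown in your subscription's access details.
SELECT
  g.geom,  -- geometry from the geography table
  d.*      -- variables from the data table
FROM `your-project.your_dataset.your_data_table` AS d
JOIN `your-project.your_dataset.your_geography_table` AS g
  ON d.geoid = g.geoid;
```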
If your dataset subscription is a Geography (i.e. digital boundaries, such as the boundaries of Spain's Autonomous Communities), only one table location will be shown and there is no join to perform.
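In that case you can query the geography table directly, as in this minimal sketch (placeholder table location, assuming geoid and geom columns):

```sql
-- Hypothetical table location; replace it with the one shown for your subscription.
SELECT geoid, geom
FROM `your-project.your_dataset.your_geography_table`;
```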
All Data Observatory data is stored in BigQuery’s US Multi-region. You can learn more about regions here. If you would like us to make the Data Observatory data available in another region, please contact our Technical Support.
If your BigQuery account is hosted in a region other than BigQuery's US Multi-region, you will need to send a request to CARTO so that our team can transfer the data directly into your own region.
You can use the provided access information (table locations and sample query) to query your Data Observatory subscriptions directly from BigQuery. To do so, you need to query BigQuery authenticated with the same credentials you used to set up your BigQuery connection(s) in CARTO:
If you have set up a connection of type OAuth (using the “Sign in with Google” option), you will be granted permissions to query your Data Observatory subscriptions using the BigQuery console while you are logged in with the same Google user.
If you have set up a connection using a Service Account instead, you will be granted permissions to query your Data Observatory subscriptions from any BigQuery client while you are authenticated with that Service Account.
Access in Snowflake
This option is only available after you have created a connection to Snowflake from the Connections section.
In order for you to access any of your Data Observatory subscriptions from Snowflake, the data first needs to be imported into your database. This import process is performed by our engineering team on a request basis.
To request it, go to the subscription’s page, click on the Access in button and choose a Snowflake connection.
If you have several Snowflake connections, you will be asked to select one to perform this action:
Finally, you will be asked to request access to the dataset from your Snowflake database.
Once we receive your request, we will get in touch with you to coordinate the import process. The data will be imported into a schema called CARTO that will be created in the Snowflake database you have set up in your Snowflake connection.
If you would like to access more than one of your Data Observatory subscriptions from your Snowflake database, it is not necessary to request access for each of them individually, as we can import several datasets at once during the same process.
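Once the import has been completed, you can verify the new schema and its tables from any Snowflake client. A minimal sketch, assuming YOURDATABASE is the database configured in your Snowflake connection:

```sql
-- List the Data Observatory tables imported into the CARTO schema.
SHOW TABLES IN SCHEMA YOURDATABASE.CARTO;
```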
Once we have imported your Data Observatory subscription into your Snowflake database, you can check its access details through the Access in button, following the same steps described in the previous section.
The following information will be displayed:
Location of the data table in your Snowflake database, in the format YOURDATABASE.CARTO.TABLE.
Location of the geography table in your Snowflake database, in the format YOURDATABASE.CARTO.TABLE.
Example query to join the data and geography tables.
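As in BigQuery, the provided query joins the data and geography tables on their shared identifier column. A minimal sketch, using placeholder table names and assuming a shared geoid column:

```sql
-- Hypothetical table names; replace them with the YOURDATABASE.CARTO.TABLE
-- values shown in your subscription's access details.
SELECT
  g.geom,  -- geometry from the geography table
  d.*      -- variables from the data table
FROM YOURDATABASE.CARTO.YOUR_DATA_TABLE AS d
JOIN YOURDATABASE.CARTO.YOUR_GEOGRAPHY_TABLE AS g
  ON d.geoid = g.geoid;
```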
Access in Redshift, Databricks and PostgreSQL
Please note that the process to request access to your Data Observatory subscriptions in your AWS Redshift cluster, Databricks workspace or PostgreSQL database is equivalent to the one described for Snowflake in the previous section.