Installation in a Google Cloud VPC


Last updated 9 months ago


This guide will walk you through configuring the CARTO Analytics Toolbox to work within a VPC alongside a CARTO Self-hosted installation on Google Cloud Platform.

Is your CARTO Self-hosted deployment in a Google Cloud VPC?

When the CARTO platform is self-hosted within a Google Cloud VPC, the functions and procedures of the Analytics Toolbox need to be accessed from within the same VPC.

This makes this installation method the only suitable one for this kind of deployment.

Install CARTO Analytics Toolbox inside your BigQuery project

The first step is to install the Analytics Toolbox in a BigQuery project of your own.

Once the Analytics Toolbox is installed in your project, use this guide to deploy the AT Gateway in your VPC.

Deploy the infrastructure needed to allow Location Data Services usage

Some functionalities of the CARTO Analytics Toolbox for BigQuery require making external calls from BigQuery to CARTO services. These calls are implemented via BigQuery Remote Functions:

  • AT Gateway: Creation of isolines, geocoding, and routing require making calls to the CARTO LDS API. Some functions of the Analytics Toolbox also require making requests to the CARTO platform backend (such as importing from a URL or the 'Send by Email' component in Workflows). For this purpose, Cloud Run functions need to be deployed in your VPC.

When installing the Analytics Toolbox manually in your own project, some configuration is required:

  • A BigQuery connection that will allow BigQuery to call Cloud Run functions.

  • An AT Gateway endpoint inside your VPC.

Architecture overview

To deploy the Analytics Toolbox within a VPC, the CARTO platform needs to deploy some additional infrastructure pieces within your GCP project. In the following diagram, you can check how all these pieces interact with each other:

We'll set up the following pieces inside your project to start using the Analytics Toolbox on your CARTO Self-hosted platform:

  • One BQ connection used to perform requests against two different Cloud Run services.

  • One subnetwork used to deploy the containers created by the two Cloud Run services that are required.

  • One Cloud Run service needed for BigQuery to interact with the Self-hosted platform.

  • One VPC Serverless Access Connector that will be used by the Cloud Run services to access your VPC.

  • An internal DNS record pointing to the IP address of your CARTO Self-hosted platform.

Follow these steps to set up the required infrastructure pieces:

All of the following commands and instructions should be executed from the Cloud Shell in your console or from authenticated gcloud and bq CLI sessions.
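Before starting, you may want to confirm that your CLI session is authenticated and pointed at the right project. A minimal check, assuming the Google Cloud SDK is installed ({PROJECT_ID} is a placeholder, as in the commands below):

```
gcloud auth login
gcloud config set project {PROJECT_ID}
gcloud config get-value project
```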

1. Configure a BQ connection to enable requests to Cloud Run services

Your BigQuery project will need to make requests to the two Cloud Run services configured in this guide. To configure the BQ connection that allows this usage, you'll need to run the following command:

Create a connection from a command line:

bq mk \
    --connection \
    --project_id={PROJECT_ID} \
    --location={REGION} \
    --connection_type=CLOUD_RESOURCE \
    carto-conn

Replace the following:

  • PROJECT_ID: your Google Cloud project ID

  • REGION: your connection region. The US and EU multi-regions are not available, so you'll have to select a more specific GCP region. You can check the list of available regions in the BigQuery documentation.

Once the connection has been configured, GCP will automatically create a service account that we'll use to grant permissions to access the Cloud Run services. You can check that the service account has been created correctly by running the following command:

Obtain the Service Account created when configuring a BQ connection:

bq show --format=json \
    --connection {PROJECT_ID}.{REGION}.carto-conn

Replace the following:

  • PROJECT_ID: your Google Cloud project ID

  • REGION: your connection region
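If you prefer to capture the service account programmatically, its ID can be extracted from the connection's JSON output. A sketch, assuming the jq tool is available and reusing the placeholders above:

```
bq show --format=json \
    --connection {PROJECT_ID}.{REGION}.carto-conn \
  | jq -r '.cloudResource.serviceAccountId'
```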

2. Deploy the AT Gateway container in Cloud Run

The BQ connection created in the previous step will have to reach a Cloud Run service in order to use the Analytics Toolbox. This service is the AT Gateway container, and prior to creating the service we'll need to create a subnetwork for it, as the Cloud Run services have to use a VPC Access Connector:

  • Create subnet for the VPC Access Connector:

gcloud compute networks subnets create vpc-conn-carto \
    --network={VPC_NETWORK} \
    --range={SUBNETWORK_IPS_RANGE} \
    --region={REGION} \
    --project={PROJECT_ID} \
    --enable-private-ip-google-access

Replace the following:

  • VPC_NETWORK: the name of the network created in your VPC project

  • SUBNETWORK_IPS_RANGE: the range of IPs that this subnetwork will use. The range must be a CIDR /28 block (e.g. 10.8.0.0/28)

  • REGION: the same GCP region used when creating the BQ connection in the previous step. This region has to match exactly

  • PROJECT_ID: your Google Cloud project ID

Now that the subnet is correctly configured, you'll need to create a Serverless VPC Access connector for the Cloud Run services.

  • Create connector for the Cloud Run services:

gcloud compute networks vpc-access connectors create carto-vpc-access-conn \
    --project={PROJECT_ID} \
    --region={REGION} \
    --subnet=vpc-conn-carto \
    --min-instances=2 \
    --max-instances=5 \
    --machine-type=e2-micro

Replace the following variables:

  • PROJECT_ID: your Google Cloud project ID

  • REGION: the same GCP region used when creating the BQ connection in the previous step
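Creating the connector can take a few minutes, and you can optionally verify that it is ready before continuing. A sketch using the same placeholders as above:

```
gcloud compute networks vpc-access connectors describe carto-vpc-access-conn \
    --project={PROJECT_ID} \
    --region={REGION} \
    --format="value(state)"
```

The connector is ready when the reported state is READY.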

Once the connector has been correctly created, we can proceed with the Cloud Run services deployment. You'll have to execute the following commands:

  • Deploy AT Gateway service

gcloud run deploy carto-at-gateway  \
    --project={PROJECT_ID} \
    --region={REGION} \
    --tag=carto-at-gateway \
    --allow-unauthenticated \
    --vpc-connector=carto-vpc-access-conn \
    --vpc-egress=all-traffic \
    --ingress=internal \
    --port=8080 \
    --set-env-vars=AT_GATEWAY_CLOUD_RUN_REGION={REGION},NODE_TLS_REJECT_UNAUTHORIZED=0 \
    --command=npm \
    --args=run,start:cloud-run \
    --image=gcr.io/carto-onprem-artifacts/at-gateway/cloud-run:latest

The NODE_TLS_REJECT_UNAUTHORIZED=0 environment variable disables TLS certificate verification, which is needed when the Self-hosted deployment uses custom (e.g. self-signed) TLS certificates.

Replace the following variables:

  • PROJECT_ID: your Google Cloud project ID

  • REGION: the same GCP region used when creating the BQ connection in the previous step

3. Create DNS entry for CARTO Self-hosted platform

The AT Gateway service will need to access the CARTO Self-hosted LDS API to perform requests to the different LDS providers. As the requests will be handled inside the VPC, it's mandatory to add an internal DNS record so that the Cloud Run service can reach the CARTO platform APIs.

First, obtain the internal IP address of the CARTO Self-hosted platform. Once you have it, you can create a DNS zone inside GCP using the following command:

If you already have an internal DNS configured in your GCP project you can skip this step and directly add a new domain pointing to the CARTO platform internal IP address.

gcloud dns managed-zones create carto-io \
    --project={PROJECT_ID} \
    --dns-name={DNS_ZONE_NAME} \
    --description="Internal DNS zone for CARTO selfhosted" \
    --networks={VPC_NETWORK} \
    --visibility=private

Replace the following variables:

  • PROJECT_ID: your Google Cloud project ID

  • DNS_ZONE_NAME: the DNS name that your new zone will manage

  • VPC_NETWORK: name of the VPC network created in your GCP project

Then we'll have to create a new record inside the new DNS zone, configuring a domain that points to the CARTO Self-hosted platform's internal IP address:

  1. Start a transaction to add a record in your DNS zone:

gcloud dns record-sets transaction start \
    --project={PROJECT_ID} \
    --zone={DNS_ZONE}

  2. Add the new domain to your DNS zone:

gcloud dns record-sets transaction add {CARTO_PLATFORM_IP} \
    --project={PROJECT_ID} \
    --name={INTERNAL_DOMAIN} \
    --ttl=300 \
    --type=A \
    --zone={DNS_ZONE}

  3. Execute the transaction to write the new changes to your DNS zone:

gcloud dns record-sets transaction execute \
    --project={PROJECT_ID} \
    --zone={DNS_ZONE}

Replace the following:

  • PROJECT_ID: your Google Cloud project ID

  • DNS_ZONE: the name of your DNS zone

  • CARTO_PLATFORM_IP: internal IP address of your CARTO Self-hosted deployment

  • INTERNAL_DOMAIN: the internal domain that will be pointing to your CARTO Self-hosted deployment inside your VPC

Make sure the CARTO_PLATFORM_IP value matches the internal IP address actually used by your CARTO Self-hosted installation.
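You can then verify that the record was created correctly. A sketch using the placeholders above (note that gcloud works with absolute DNS names, so the trailing dot may be required):

```
gcloud dns record-sets list \
    --project={PROJECT_ID} \
    --zone={DNS_ZONE} \
    --name={INTERNAL_DOMAIN}.
```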

4. Check firewall rules to ensure that Cloud Run can reach the Self-hosted instance

The Cloud Run services need access to the CARTO Self-hosted environment, so you'll have to check that the firewall rules configured in your project allow traffic between these two pieces.

The CARTO Self-hosted platform has to be reachable on port 443, and it should be allowed to respond to requests performed by the Cloud Run services deployed in the previous steps.

All requests will be handled inside the VPC, so all network traffic involved in this process will take place between the subnetworks created and the CARTO Self-hosted instance.
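If your VPC blocks this traffic by default, an ingress rule along these lines may be needed. This is a hypothetical sketch: the rule name is an assumption, and the source range reuses the {SUBNETWORK_IPS_RANGE} placeholder from step 2:

```
gcloud compute firewall-rules create allow-carto-cloudrun-to-selfhosted \
    --project={PROJECT_ID} \
    --network={VPC_NETWORK} \
    --direction=INGRESS \
    --action=ALLOW \
    --rules=tcp:443 \
    --source-ranges={SUBNETWORK_IPS_RANGE}
```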

Configure the AT Gateway in your CARTO Analytics Toolbox installation

Now that we've installed the Analytics Toolbox and deployed the required infrastructure pieces in GCP, we have to configure the Analytics Toolbox so that it can use the AT Gateway.

The Analytics Toolbox provides a procedure to update the configuration values required to start using the remote functions. These values can be set by executing the following query in your BigQuery project:

CALL carto.SETUP("""{
  "connection": "{CONNECTION}",
  "endpoint": "{ENDPOINT}",
  "api_base_url": "{API_BASE_URL}",
  "api_access_token": "{API_ACCESS_TOKEN}"
}""");

Replace the following:

  • CONNECTION: name of the connection created in the previous step. The default value is {PROJECT_ID}.{REGION}.carto-conn

  • ENDPOINT: endpoint of the AT Gateway function deployed in Cloud Run

  • API_BASE_URL: the API base URL of your CARTO Self-hosted platform

  • API_ACCESS_TOKEN: access token generated inside the CARTO platform with permissions to use the LDS API

The expected ENDPOINT value can be obtained by executing the following command:

gcloud run services describe carto-at-gateway \
    --project={PROJECT_ID} \
    --region={REGION} \
    --format="value(status.address.url)"

After running the previous query, the CARTO Analytics Toolbox should be ready to work in your BigQuery project. To check that the installation process has worked as expected, you can execute the following queries in the BigQuery console. They will create a table called geocode_test_table containing a geocoded address.

CREATE TABLE {DATASET}.geocode_test_table AS (
  SELECT "Madrid" AS address
);

CALL carto.GEOCODE_TABLE(NULL, NULL, '{PROJECT}.{DATASET}.geocode_test_table', 'address', NULL, NULL, NULL);
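Once the procedure finishes, you can inspect the resulting table to confirm the address was geocoded. A sketch using the bq CLI with the same placeholders:

```
bq query --use_legacy_sql=false \
    'SELECT * FROM `{PROJECT}.{DATASET}.geocode_test_table`'
```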

Congratulations! If the previous query execution finishes and you can obtain the geocoded address by querying the table that has been created, your CARTO Analytics Toolbox is successfully installed and configured inside your VPC.

Now, remember to set up your connections to BigQuery with the correct Analytics Toolbox location setting to ensure that all queries generated by CARTO applications use it.
