Manual installation in your own project

Last updated 2 months ago

The manual installation of the Analytics Toolbox is recommended when you need to support other regions required by your project, or when you need to make the toolbox available inside a Virtual Private Cloud (VPC).

This guide will use Google Cloud Shell to set up and install the toolbox. Please open the GCP console, select the project where the toolbox will be installed, and then use the “>_” button (top right) to “Activate Cloud Shell”.

All following commands and instructions should be executed from the Cloud Shell in your console, or from authenticated gcloud and bq CLI sessions.

Setup

This step consists of setting up the BigQuery project where you want to install the toolbox. A Google account is required.

Prepare the resources

You will need a GCP project to install the toolbox, as well as a storage bucket in the same project to store the JavaScript libraries needed. Users of the toolbox will need permission to read both the BigQuery dataset (where the functions and procedures will be installed) and the bucket in order to run the CARTO Analytics Toolbox.

We will set the project and bucket names, as well as the location where the toolbox will be created (which should be the same as the bucket's), as Cloud Shell environment variables:

  • PROJECT: Project id where the toolbox dataset will be created

  • REGION: Location of the BigQuery dataset that will be created to install the Analytics Toolbox

  • BUCKET: Name of the bucket to store the JavaScript libraries needed by the toolbox (please omit any protocol prefix like gs://)

Set these variables by executing the following in Cloud Shell (after replacing the appropriate values):

export PROJECT="<my-project>"
export REGION="<my-region>"
export BUCKET="<my-bucket>"
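Before moving on, it can help to sanity-check these variables. A minimal sketch (the values below are hypothetical; replace them with your own):

```shell
# Hypothetical example values; replace with your own
export PROJECT="my-project"
export REGION="europe-west1"
export BUCKET="my-bucket"

# Fail early if a variable is empty or BUCKET still carries a protocol prefix
for v in PROJECT REGION BUCKET; do
  val=$(eval "echo \$$v")
  if [ -z "$val" ]; then
    echo "ERROR: $v is not set" >&2
    exit 1
  fi
done
case "$BUCKET" in
  gs://*)
    echo "ERROR: BUCKET must not include the gs:// prefix" >&2
    exit 1
    ;;
esac
echo "OK: installing into $PROJECT:carto ($REGION), libraries in gs://$BUCKET"
```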

Creating the dataset

This step is only required before the first installation. Activate the Cloud Shell in the target project and ensure the environment variables from the preparation step above are set.

Before starting the process, make sure the target GCP project exists and that it is the correct one by executing the following:

# Check project existence
gcloud projects describe $PROJECT

Then, create a BigQuery dataset named carto, where the Analytics Toolbox will be installed:

# Create dataset "carto"
bq mk --location=$REGION --description="CARTO dataset" -d $PROJECT:carto

Installation

Once the setup is completed, we can proceed with the installation of the toolbox. This step is performed the first time and every time you want to install an updated version.

Prepare the package

Please reach out to our Support team in order to get access to the installation package. The CARTO team will provide a zip that contains the scripts to install the Analytics Toolbox:

  • License file

  • modules.sql file

  • libs directory with the JS files

Take a look at the documentation if you need help uploading the package to your Cloud Shell. Once you're all set, run the following commands:

# Make sure you use the correct file name
AT_PACKAGE=carto-analytics-toolbox-bigquery-latest.zip

# Uncompress package
unzip $AT_PACKAGE

# Enter the directory
cd $(unzip -Z -1 $AT_PACKAGE | head -1)

# Read the license
cat LICENSE

Create catalog tables and variables

The functions in the data module use the spatial catalog tables. Running the following commands creates those tables in the carto dataset. When subscriptions are provisioned, the catalog tables will be updated.

# Create spatial catalog datasets
bq query --use_legacy_sql=false --project_id=$PROJECT \
'CREATE TABLE IF NOT EXISTS carto.spatial_catalog_datasets (
  dataset_id STRING,
  dataset_slug STRING,
  dataset_name STRING,
  dataset_country STRING,
  dataset_category STRING,
  dataset_provider STRING,
  dataset_version STRING,
  dataset_geom_type STRING,
  dataset_is_public BOOLEAN,
  dataset_is_product BOOLEAN,
  associated_geography_id STRING
);'

# Create spatial catalog variables
bq query --use_legacy_sql=false --project_id=$PROJECT \
'CREATE TABLE IF NOT EXISTS carto.spatial_catalog_variables (
  variable_slug STRING,
  variable_name STRING,
  variable_description STRING,
  variable_type STRING,
  variable_aggregation STRING,
  dataset_slug STRING
);'

Copy the libraries

This command installs the JavaScript libraries for the Analytics Toolbox in the selected bucket. These libraries will be used at runtime by the functions.

# Copy libs to bucket
gsutil -m cp -r libs/ gs://$BUCKET/carto/

Create the functions and procedures

This command installs the functions and procedures of the Analytics Toolbox in the selected project. This operation takes a few minutes to complete.

# Prepare SQL code
sed -e 's!@@BUCKET@@!'"$BUCKET"'!g' modules.sql > modules_rep.sql

# Create the functions and procedures
bq --location=$REGION --project_id=$PROJECT query --use_legacy_sql=false \
--max_statement_results=10000 --format=prettyjson < modules_rep.sql
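The sed call above only replaces the @@BUCKET@@ placeholder in modules.sql with your bucket name. A minimal local demonstration of the substitution, using a hypothetical line similar to the library references in modules.sql:

```shell
BUCKET="my-bucket"

# A hypothetical line resembling the library references in modules.sql
line='OPTIONS (library=["gs://@@BUCKET@@/carto/libs/h3.js"])'

# Same substitution as in the installation step; '!' is used as the sed
# delimiter so the slashes in the bucket path need no escaping
echo "$line" | sed -e 's!@@BUCKET@@!'"$BUCKET"'!g'
# → OPTIONS (library=["gs://my-bucket/carto/libs/h3.js"])
```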

The installation process might take up to 5 minutes.

Please be aware that this script will remove all previous functions and procedures in the carto dataset.

AT Gateway

Some functionalities of the CARTO Analytics Toolbox for BigQuery require making external calls from BigQuery to CARTO services. These calls are implemented via BigQuery Remote Functions:

  • Creation of isolines, geocoding and routing requires making calls to the CARTO LDS API. In order to call this API, BigQuery needs a connection and a Cloud Run function that works as a proxy.

  • Some other functions of the Analytics Toolbox require making a request to the CARTO Platform backend. For this purpose, CARTO provides Cloud Run functions in different regions.

These services are readily available when using the Analytics Toolbox from a CARTO maintained project, but when installing the Analytics Toolbox manually in your own project, some configuration is required:

  • Create a BigQuery connection that will allow calling Cloud Run functions from BigQuery.

  • Select an AT Gateway endpoint that is in the same region as your project.

Create a BigQuery connection

As mentioned above, BigQuery connections are used for running remote functions from BigQuery.

If you are installing CARTO Self-Hosted in a Google Cloud VPC, please refer to this section to learn how to provision the AT Gateway in your VPC before proceeding with this part of the configuration.

Create a connection with the command below. Please note that the bigquery.connections.delegate permission will be required:

# Create the connection
bq mk --connection --display_name='carto-conn' \
--connection_type=CLOUD_RESOURCE --project_id=$PROJECT \
--location=$REGION carto-conn

Export the CONNECTION variable:

export CONNECTION="$PROJECT.$REGION.carto-conn"

At this point, if you're installing the Analytics Toolbox in a Google Cloud VPC, please refer to this section.

Find an AT Gateway endpoint

From the table below, find the AT Gateway endpoint that corresponds to your region. For US and EU multi-regions, choose any US or EU endpoint.

| Project | Service | Region | URL |
| --- | --- | --- | --- |
| carto-analytics-toolbox | atgwfunc-as-ea1-7604 | asia-east1 | https://atgwfunc-as-ea1-7604-cmc6qlhtda-de.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-as-ea2-9555 | asia-east2 | https://atgwfunc-as-ea2-9555-cmc6qlhtda-df.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-as-ne1-7871 | asia-northeast1 | https://atgwfunc-as-ne1-7871-cmc6qlhtda-an.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-as-ne2-3699 | asia-northeast2 | https://atgwfunc-as-ne2-3699-cmc6qlhtda-dt.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-as-ne3-9803 | asia-northeast3 | https://atgwfunc-as-ne3-9803-cmc6qlhtda-du.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-as-se1-9008 | asia-southeast1 | https://atgwfunc-as-se1-9008-cmc6qlhtda-as.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-as-se2-9937 | asia-southeast2 | https://atgwfunc-as-se2-9937-cmc6qlhtda-et.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-as-so1-7788 | asia-south1 | https://atgwfunc-as-so1-7788-cmc6qlhtda-el.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-as-so2-6443 | asia-south2 | https://atgwfunc-as-so2-6443-cmc6qlhtda-em.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-au-se1-2301 | australia-southeast1 | https://atgwfunc-au-se1-2301-cmc6qlhtda-ts.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-au-se2-9274 | australia-southeast2 | https://atgwfunc-au-se2-9274-cmc6qlhtda-km.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-eu-ce2-6914 | europe-central2 | https://atgwfunc-eu-ce2-6914-cmc6qlhtda-lm.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-eu-no1-6484 | europe-north1 | https://atgwfunc-eu-no1-6484-cmc6qlhtda-lz.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-eu-sw1-1259 | europe-southwest1 | https://atgwfunc-eu-sw1-1259-cmc6qlhtda-no.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-eu-we1-1809 | europe-west1 | https://atgwfunc-eu-we1-1809-cmc6qlhtda-ew.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-eu-we12-743 | europe-west12 | https://atgwfunc-eu-we12-743-cmc6qlhtda-og.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-eu-we2-9846 | europe-west2 | https://atgwfunc-eu-we2-9846-cmc6qlhtda-nw.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-eu-we3-9705 | europe-west3 | https://atgwfunc-eu-we3-9705-cmc6qlhtda-ey.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-eu-we4-4558 | europe-west4 | https://atgwfunc-eu-we4-4558-cmc6qlhtda-ez.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-eu-we6-5628 | europe-west6 | https://atgwfunc-eu-we6-5628-cmc6qlhtda-oa.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-eu-we9-8712 | europe-west9 | https://atgwfunc-eu-we9-8712-cmc6qlhtda-od.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-me-ce1-6804 | me-central1 | https://atgwfunc-me-ce1-6804-cmc6qlhtda-ww.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-me-we1-8177 | me-west1 | https://atgwfunc-me-we1-8177-cmc6qlhtda-zf.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-na-ne1-9070 | northamerica-northeast1 | https://atgwfunc-na-ne1-9070-cmc6qlhtda-nn.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-na-ne2-1664 | northamerica-northeast2 | https://atgwfunc-na-ne2-1664-cmc6qlhtda-pd.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-sa-ea1-5299 | southamerica-east1 | https://atgwfunc-sa-ea1-5299-cmc6qlhtda-rj.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-sa-we1-1940 | southamerica-west1 | https://atgwfunc-sa-we1-1940-cmc6qlhtda-tl.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-us-ce1-4566 | us-central1 | https://atgwfunc-us-ce1-4566-cmc6qlhtda-uc.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-us-ea1-8111 | us-east1 | https://atgwfunc-us-ea1-8111-cmc6qlhtda-ue.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-us-ea4-8568 | us-east4 | https://atgwfunc-us-ea4-8568-cmc6qlhtda-uk.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-us-we1-4218 | us-west1 | https://atgwfunc-us-we1-4218-cmc6qlhtda-uw.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-us-we2-3390 | us-west2 | https://atgwfunc-us-we2-3390-cmc6qlhtda-wl.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-us-we3-1834 | us-west3 | https://atgwfunc-us-we3-1834-cmc6qlhtda-wm.a.run.app/ |
| carto-analytics-toolbox | atgwfunc-us-we4-1057 | us-west4 | https://atgwfunc-us-we4-1057-cmc6qlhtda-wn.a.run.app/ |
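If you script the installation, the lookup can be captured in a small case statement. A sketch covering two regions, with endpoint URLs copied from the table above (extend it with the rows you need):

```shell
REGION="europe-west1"

# Endpoint URLs copied from the table above for two example regions
case "$REGION" in
  europe-west1) ENDPOINT="https://atgwfunc-eu-we1-1809-cmc6qlhtda-ew.a.run.app/" ;;
  us-east1)     ENDPOINT="https://atgwfunc-us-ea1-8111-cmc6qlhtda-ue.a.run.app/" ;;
  *)
    echo "Look up $REGION in the table above" >&2
    exit 1
    ;;
esac
export ENDPOINT
echo "$ENDPOINT"
# → https://atgwfunc-eu-we1-1809-cmc6qlhtda-ew.a.run.app/
```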

Export the ENDPOINT variable with the chosen AT Gateway endpoint:

export ENDPOINT="<gateway_endpoint>"

Find CARTO LDS API connection details

In order to use the functions in the lds module, you need to get the API Base URL and LDS Token from your CARTO account.

To get the API Base URL, go to the “Developers” section in the CARTO platform and copy the value. For more information, check the documentation.

To get the LDS Token, go to the “Developers” section and create a new API Access Token. For more information, check the documentation. Make sure your token has the LDS API enabled.

Let's export those values as variables:

export API_BASE_URL="<api-base-url>"
export API_ACCESS_TOKEN="<api-access-token>"

Run the SETUP procedure

Now that we have collected all the necessary parameters, we can call the SETUP procedure to have our own installation of the Analytics Toolbox ready and available:

CALL carto.SETUP("""{
  "connection": "{CONNECTION}",
  "endpoint": "{ENDPOINT}",
  "api_base_url": "{API_BASE_URL}",
  "api_access_token": "{API_ACCESS_TOKEN}"
}""");
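The {CONNECTION}, {ENDPOINT}, {API_BASE_URL} and {API_ACCESS_TOKEN} placeholders must be replaced with the values collected above. One way to generate the statement from the exported variables (a sketch with hypothetical values; the bq invocation at the end is commented out):

```shell
# Hypothetical example values; in practice these were exported earlier
export CONNECTION="my-project.europe-west1.carto-conn"
export ENDPOINT="https://atgwfunc-eu-we1-1809-cmc6qlhtda-ew.a.run.app/"
export API_BASE_URL="https://my-api-base-url.example.com"
export API_ACCESS_TOKEN="my-token"

# Substitute the variables into the SETUP call
cat > setup_call.sql <<EOF
CALL carto.SETUP("""{
  "connection": "${CONNECTION}",
  "endpoint": "${ENDPOINT}",
  "api_base_url": "${API_BASE_URL}",
  "api_access_token": "${API_ACCESS_TOKEN}"
}""");
EOF

cat setup_call.sql
# Then run it, for example:
#   bq --project_id=$PROJECT query --use_legacy_sql=false < setup_call.sql
```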

After running the previous query, the CARTO Analytics Toolbox should be ready to use in your BigQuery project. To check that the installation process worked as expected, you can execute the following queries in the BigQuery console. They will create a table called geocode_test_table containing a geocoded address.

CREATE TABLE {DATASET}.geocode_test_table AS (
  SELECT "Madrid" AS address
);

CALL carto.GEOCODE_TABLE(NULL, NULL, '{PROJECT}.{DATASET}.geocode_test_table', 'address', NULL, NULL, NULL);

Congratulations! 🎉 If the previous query execution finishes and you can obtain the geocoded address by querying the table that has been created, your installation is complete. Now, remember to set up your connections to BigQuery with the correct Analytics Toolbox location setting to ensure that all queries generated by CARTO applications use it.

After an installation or update of the Analytics Toolbox is performed, the CARTO connection needs to be refreshed by the owner of the connection by clicking the refresh button on the connection's card.
