SQL analyses

CARTO Builder offers a UI that helps you build SQL queries to perform geospatial analysis operations.

To get started with SQL Analysis, add a source to your Builder map and click on the three dots to find the Add SQL Analysis option.

After that, you will see a list of analyses compatible with your source. Compatibility depends on factors such as whether the CARTO Analytics Toolbox is required and on the connection’s cloud data warehouse.

Check this table to find out which analyses are available for each data warehouse:

|  | BigQuery | CARTO DW | Redshift | Snowflake | PostgreSQL |
| --- | --- | --- | --- | --- | --- |
| Intersect and Aggregate | ✅ | ✅ | ✅ | ✅ | ✅ |
| Create buffers | ✅ | ✅ | ✅ | ✅ (*) | ✅ |
| Add column from second source | ✅ | ✅ | ✅ | ✅ | ✅ |
| Filter by column value | ✅ | ✅ | ✅ | ✅ | ✅ |
| Calculate Centroids | ✅ | ✅ | ✅ | ✅ | ✅ |
| Clustering K-Means | ✅ (*) | ✅ | ✅ (*) | ✅ (*) | ✅ |
| Trade Areas | ✅ (*) | ✅ | ✅ (*) | ✅ (*) |  |

(*) Requires the CARTO Analytics Toolbox to be installed.


Each analysis creates a SQL query that performs the geospatial operation, taking into account the syntax, specific functions and other nuances of each cloud data warehouse. These queries use CTEs (Common Table Expressions), so that different analyses can be chained into a more complex sequence.
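
As a rough illustration, a chained sequence of two analyses (a filter followed by a buffer) could produce a query shaped like the sketch below. The table and column names are hypothetical, and the exact SQL emitted by Builder depends on the analyses and the data warehouse.

```sql
-- Illustrative sketch only (hypothetical table and columns, BigQuery-style syntax).
WITH filtered_stores AS (
  -- First analysis in the chain: filter by column value.
  SELECT *
  FROM stores
  WHERE revenue >= 100000
)
-- Second analysis in the chain: create buffers around the filtered rows.
SELECT
  store_id,
  ST_BUFFER(geom, 500) AS geom
FROM filtered_stores;
```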

The resulting SQL query can be loaded into the map in different ways:

  • Run SQL analysis will load the query as a SQL Query source in Builder immediately.

  • Preview SQL analysis query will load the query in the SQL panel in Builder, so you can review and modify it before running it.

  • Save results in a new table lets you select the destination table that will contain the result of the query. The table will then be loaded as a source in the map.

The following are the currently available SQL analyses:

Intersect and aggregate

This analysis performs a geospatial intersection between two different sources, aggregating data from the second source into the base source wherever geometries from the second source intersect with geometries in the base source.

The result of this SQL analysis includes all columns from the base source and an extra one, called agg_value, that contains the aggregated data from the features in the second source that intersect with each row in the base source.

Parameters

  • Second source: Pick an existing source from your map, or a table from your cloud data warehouse, to be used as the second source for this analysis.

  • Aggregation operation: Select the operation to aggregate the data from the second source.

  • Aggregation column: Select the column from the second source that will be aggregated.

Example

A very common use case for this SQL Analysis would be “Get the average revenue from all stores in each neighborhood”. In this hypothetical scenario:

  • a table containing the polygon geometries for each neighborhood would be the base source.

  • a table containing the point geometries of each store would be the second source.

  • the aggregation operation would be the average (AVG).

  • the aggregation column would be the one that contains the revenue in the stores table.
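
For this hypothetical scenario, the generated query would be roughly equivalent to the sketch below (simplified to an id and a geometry column; the actual query keeps all columns from the base source, and the exact SQL depends on the data warehouse):

```sql
-- Hypothetical tables: neighborhoods (base source) and stores (second source).
SELECT
  n.neighborhood_id,
  ANY_VALUE(n.geom) AS geom,
  AVG(s.revenue) AS agg_value
FROM neighborhoods AS n
LEFT JOIN stores AS s
  ON ST_INTERSECTS(n.geom, s.geom)
GROUP BY n.neighborhood_id;
```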

Add column from second source

This analysis creates a LEFT JOIN SQL query, allowing columns from both the base and the second source to be included in the result.

Parameters

  • Key columns: For each source, a key column needs to be selected. This column will be used to join the rows from the base and second source that share the same value.

  • Columns to be included in the result: Select the columns from each source that will be included in the result.
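
A minimal sketch of the kind of LEFT JOIN this analysis produces, assuming hypothetical stores and demographics tables that share a zip_code key column:

```sql
SELECT
  s.store_id,
  s.geom,
  d.population  -- column brought in from the second source
FROM stores AS s
LEFT JOIN demographics AS d
  ON s.zip_code = d.zip_code;
```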

Filter by column value

This analysis lets you keep or discard rows based on a column value.

Parameters

  • Target column: Select the column that will be used for the filter.

  • Filter operator: Select a type of filter from the list of available operators.

  • Values: Use the selector to configure your filter.
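
The generated query is essentially a SELECT with a WHERE clause built from the chosen operator and values; for example, with a hypothetical stores table:

```sql
-- Keep only the rows whose store_type matches one of the selected values.
SELECT *
FROM stores
WHERE store_type IN ('supermarket', 'convenience');
```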

Create buffers

This analysis creates a distance buffer around your existing geometries. It works with points, lines and polygons, and the resulting geometries will always be polygons.

Parameters

  • Distance: Select the distance that will be used to create the buffers.

  • Tracts: Select the number of concentric buffers that will be created.

  • Individual/Combined result: Select between having an individual buffer created for each row, or combining them all into a single polygon.
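
As a sketch of both result modes, assuming a hypothetical stores table and BigQuery-style functions (other warehouses use their own equivalents):

```sql
-- Individual result: one 500 m buffer polygon per row.
SELECT
  store_id,
  ST_BUFFER(geom, 500) AS geom
FROM stores;

-- Combined result: all buffers merged into a single polygon.
SELECT ST_UNION_AGG(ST_BUFFER(geom, 500)) AS geom
FROM stores;
```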

Compute centroids

This analysis will produce a point that represents the centroid of the geometries in your source. By default, it produces a single point. Using the optional Categorize parameter, you can get one centroid per category in the dataset.

Parameters

  • Categorize: Select a column that contains categories to create one centroid per category in your dataset.

  • Aggregation: Aggregate data from the original dataset into the resulting centroids. The result of the analysis will include a column aggregated_value that contains the value of the aggregation.

    • Aggregation Operation: Select an aggregation operation from the list.

    • Aggregation Column: Select a column to be aggregated.
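
A sketch of the categorized variant with an aggregation, using a hypothetical stores table and BigQuery-style functions:

```sql
-- One centroid per store_type, carrying the average revenue as aggregated_value.
SELECT
  store_type,
  ST_CENTROID_AGG(geom) AS geom,
  AVG(revenue) AS aggregated_value
FROM stores
GROUP BY store_type;
```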

K-Means Clustering

This analysis uses the ST_CLUSTERKMEANS function from the CARTO Analytics Toolbox for BigQuery, or ST_ClusterKMeans() in PostGIS, taking a set of points and finding a defined number of clusters based on the k-means algorithm. It generates a cluster_no column that indicates the cluster each point belongs to.

Parameters

  • Number of clusters: Define the number of clusters that will be produced by the analysis.
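
For reference, a minimal PostGIS-flavored sketch of what the analysis computes (hypothetical table and columns; on BigQuery the generated query relies on ST_CLUSTERKMEANS from the Analytics Toolbox instead):

```sql
-- Assign each point to one of 5 clusters; cluster_no values start at 0.
SELECT
  store_id,
  geom,
  ST_ClusterKMeans(geom, 5) OVER () AS cluster_no
FROM stores;
```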

This analysis can be performed safely with up to ~700K rows. Bigger sources can cause the resulting SQL query to hit some limits in BigQuery. Due to an internal Redshift limitation, the query produced by this analysis includes a LIMIT 100 clause. Removing this limit might cause an error during query execution.

Trade Areas

This analysis leverages the CREATE_ISOLINES function in the CARTO Analytics Toolbox for BigQuery, Snowflake and Redshift to generate time or distance isolines based on different modes of transportation.

The input source for this analysis should contain point geometries, which will be taken as the origin points for the isoline generation.

This SQL Analysis is available for BigQuery, Snowflake and Redshift connections, and it requires a specific minimum version of the CARTO Analytics Toolbox Advanced module to be installed:

  • Snowflake: 2022.06.09

  • Redshift: 2022.06.07

The result from this analysis can only be saved as a new table.

Parameters

  • Mode: Define the transportation mode that will be used for the isoline computation.

  • Range Type: Define the type of range that will be used for the isoline computation:

    • Distance: The resulting isoline will describe the area that can be covered by traveling a specific distance set in meters.

    • Time: The resulting isoline will describe the area that can be covered by traveling during a specific time set in seconds.
