Migrating your content to the new CARTO platform


In 2021 we launched a new version of the CARTO platform, designed to connect seamlessly with leading cloud data warehouses. The new CARTO Cloud-native platform delivers unparalleled performance, scalability, and innovative features such as CARTO Workflows and AI Agents, plus an enhanced CARTO Builder experience and improved developer tools.

We encourage all CARTO users to explore and use the new platform, as the Legacy version will be progressively retired for the general public, with due notice.

This guide covers how to migrate your content (data, maps, and applications) from legacy accounts to the new version of the CARTO platform.

Considerations

How do I know if I'm using a Legacy version of CARTO?
  • For the Legacy version of CARTO, the URL will typically be something like my_username.carto.com. It can be accessed via https://carto.com/login.

  • When using the new CARTO platform, the URL will be app.carto.com, and it can be accessed via https://app.carto.com.

If you already access CARTO through app.carto.com and all your content is there, no need to worry: you're already using the new platform and no migration is required.

I’m using the Legacy version, what content do I need to migrate and why?

Technically speaking, there are many substantial differences between the legacy platform and the new cloud-native platform.

For example, CARTO Legacy uses a fully managed PostgreSQL database where you had to upload data in order to use it, while CARTO Cloud-native uses a live connection to your preferred data warehouse: BigQuery, Snowflake, Redshift, Databricks, or PostgreSQL.

Another example is the mapping engine: our legacy platform uses Leaflet, whereas the new CARTO platform uses modern vector rendering techniques over deck.gl.

Consequently, the data and content generated in the Legacy platform need to be migrated manually, typically following these steps:

  • Migrating data: export your data from the legacy CARTO PostgreSQL DB into your desired data warehouse. Learn more about Migrating data below.

  • Re-creating maps: re-create the necessary maps in the new platform, using the new techniques and features in the new CARTO. Learn more about Re-creating maps below.

  • Migrating applications: migrate your existing application code so that it uses deck.gl and the new CARTO APIs. Learn more about Migrating applications below.

What if I don’t have a data warehouse of my own?

No problem! CARTO provides a built-in data warehouse for every organization, no setup required. The included CARTO Data Warehouse is highly scalable and performant, but it is a less flexible solution than using your own data warehouse. In the long run, you should consider using your own data warehouse.

Accessing your new CARTO organization

To start your migration process, let's make sure you have a valid organization in the new CARTO. The legacy and the new platforms use totally different login credentials, so you might need to sign up first.

If you already have an organization in the new CARTO, just log in via app.carto.com.

If you don't have an organization in the new CARTO, create a new 14-day trial via app.carto.com/signup. Then, email your CARTO representative or our Support team (support@carto.com) with your details and our team will complete the setup for your new organization as needed.

You should now see the CARTO Workspace. We're ready!

Migrating data

There are several methods to migrate your data, and the best one for you will depend on your code expertise, the amount of data in your CARTO legacy account, and the data warehouse you plan to migrate it to.

Method 1: UI-based export and import

For small amounts of data, the recommended way is to simply use the User Interface (UI) of both platforms to export and import your data:

  1. Download your data from the legacy CARTO platform: you can use any of these formats: CSV, SHP, KML, GeoJSON and GeoPackage.

  2. Upload your data to CARTO 3: use the new Import functionality to upload the legacy files into the desired data warehouse connection.

Method 2: Connect directly to your legacy CARTO DB and import using your own data warehouse

If you're comfortable with data warehouse ETLs and you have larger volumes of data, you can create an external connection with your CARTO 2 DB to access and create a backup copy of your data:

  1. Exporting your backup: Use pg_dump to create a SQL file containing the data from your DB. Alternatively, you can also use CARTOframes to export the data using Python.

  2. Importing to your data warehouse: Use your preferred library to copy this data into your data warehouse.

    1. BigQuery: Convert the backup to CSV/JSON using pg_restore or a similar tool, then load it into BigQuery via the web UI, bq CLI, or Data Transfer Service. Ensure geometry columns are converted to WKT format.

    2. Snowflake: Export the data to CSV using pg_restore or a similar tool and stage it in an S3 bucket or Snowflake Stage. Use the COPY INTO command to load the data.

    3. Redshift: Restore the dump to a PostgreSQL instance, then use AWS DMS or COPY commands with CSV files staged in S3 to load data into Redshift.

    4. Databricks: Export the data to CSV/Parquet, upload it to a cloud storage location (e.g., S3 or Azure Blob), and use Databricks' spark.read to ingest it. Geometry columns may need transformation to WKT/WKB.

    5. PostgreSQL: Use pg_restore with the --dbname flag to directly import the dump into another PostgreSQL database.
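
As an illustration of the BigQuery path, here is a minimal sketch using the official Node.js client (@google-cloud/bigquery). The dataset, table, and file names are placeholders, autodetect is used instead of an explicit schema, and geometry columns are expected to already be in WKT format, as noted above:

// Load an exported CSV into BigQuery (placeholder dataset/table/file names)
import {BigQuery} from '@google-cloud/bigquery';

const bigquery = new BigQuery();

async function loadCsv() {
  await bigquery
    .dataset('legacy_migration')
    .table('countries')
    .load('countries.csv', {
      sourceFormat: 'CSV',
      skipLeadingRows: 1,          // skip the CSV header row
      autodetect: true,            // let BigQuery infer column types
      writeDisposition: 'WRITE_TRUNCATE'
    });
  console.log('CSV loaded into BigQuery');
}

loadCsv().catch(console.error);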

Method 3: Connect directly to your legacy CARTO DB and import using the CARTO Imports API

Lastly, for those with larger volumes of data who are less comfortable with data warehouse ETLs (or are using the included CARTO Data Warehouse), you can create an external connection with your CARTO 2 DB to access and create a backup copy of your data, and then import it using our Imports API:

  1. Exporting your backup: Use pg_dump to create a SQL file containing the data from your DB. Alternatively, you can also use CARTOframes to export the data using Python.

  2. Convert it to CSV: Use pg_restore (for SQL files) or a script (for Python exports) to convert your backup into CSV files.

  3. Upload it to an object storage bucket (S3, GCS, Azure Blob…): Use your preferred bucket system and expose the files using public or signed URLs.

  4. Using the Imports API to import your backup: The Imports API is a REST API that allows you to import data from a URL into your own data warehouse, leveraging your CARTO connection. It performs the necessary checks and optimizations for visualization and analysis using CARTO. This is an example of an Import job using our Imports API:

Example using the Imports API
// Request headers: JSON content type plus your CARTO API access token
var myHeaders = new Headers();
myHeaders.append("Content-Type", "application/json");
myHeaders.append("Authorization", "Bearer your-API-access-token");


// Import job definition: source file URL, destination table, and schema
var raw = JSON.stringify({
  "connection": "your-connection-name",
  "url": "https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1005861/2b.__FC_-_Over_25k_-_June_21.csv",
  "destination": "project.dataset.table",
  "overwrite": false,
  "autoguessing": true,
  "schema": {
    "columns": [
      {
        "name": "geom",
        "type": "GEOGRAPHY",
        "nullable": true
      },
      {
        "name": "cartodb_id",
        "type": "INT64",
        "nullable": true
      },
      {
        "name": "lat",
        "type": "FLOAT64",
        "nullable": true
      },
      {
        "name": "lon",
        "type": "FLOAT64",
        "nullable": true
      },
      {
        "name": "pais",
        "type": "STRING",
        "nullable": true
      },
      {
        "name": "pais_long",
        "type": "STRING",
        "nullable": true
      },
      {
        "name": "iso3",
        "type": "STRING",
        "nullable": true
      },
      {
        "name": "continente",
        "type": "STRING",
        "nullable": true
      },
      {
        "name": "subregion",
        "type": "STRING",
        "nullable": true
      },
      {
        "name": "iso2",
        "type": "STRING",
        "nullable": true
      },
      {
        "name": "capital",
        "type": "STRING",
        "nullable": true
      },
      {
        "name": "iso_n3",
        "type": "STRING",
        "nullable": true
      },
      {
        "name": "poblacion",
        "type": "STRING",
        "nullable": true
      }
    ]
  }
});


var requestOptions = {
  method: 'POST',
  headers: myHeaders,
  body: raw,
  redirect: 'follow'
};


// POST the import job to the Imports API (adjust the regional API base URL)
fetch("https://gcp-us-east1.api.carto.com/v3/imports", requestOptions)
  .then(response => response.text())
  .then(result => console.log(result))
  .catch(error => console.log('error', error));

Please note that the Imports API is not yet available for Databricks connections.
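
The POST request returns an import job that you can poll until it finishes. Below is a minimal sketch of that pattern; it assumes the response body includes an id field and that the job status can be read back from the same /v3/imports endpoint, so double-check the Imports API reference for the exact response shape in your region:

// Poll an import job until it finishes (assumes the POST response includes
// an `id` and that GET /v3/imports/{id} returns a `status` field)
async function waitForImport(importId, token) {
  const url = "https://gcp-us-east1.api.carto.com/v3/imports/" + importId;
  while (true) {
    const response = await fetch(url, {
      headers: { "Authorization": "Bearer " + token }
    });
    const job = await response.json();
    if (job.status === "complete" || job.status === "failed") {
      return job;
    }
    // Wait 5 seconds between checks
    await new Promise(resolve => setTimeout(resolve, 5000));
  }
}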

Re-creating maps

Once your data is in the new CARTO, you can re-create your maps using the new CARTO Builder. This will be a manual process and can't be automated. Here are some tips and considerations to guide you through the process of building your new maps:

  1. Get familiar with the new CARTO Builder using our documentation and the tutorials in the CARTO Academy.

  2. Performance for smaller amounts of data can be similar, but the new CARTO is definitely more powerful for larger volumes of data.

  3. The new maps do not need to match the old ones exactly. Consider using new functionality such as SQL Parameters, split view, 3D views, and more. Likewise, some features in the legacy CARTO might not match exactly the features in the new platform, and you might need to choose different solutions.

  4. Use the brand new CARTO Workflows if you want to edit or transform your data.

  5. If you were using custom basemaps, you'll need to first register them in Settings > Customizations > Basemaps.

  6. If you re-use color palettes across multiple maps, you can store them in Settings > Customizations > Palettes.

Once you're comfortable with your new map, make sure to review the title and description, and share it with your team or with the world! And don't forget: if you hit a roadblock in the map re-building process, our team (support@carto.com) will be there to help you.

Migrating applications

If you were running custom applications using our legacy tools, you'll need to migrate them to use our new stack. The new CARTO offers a complete set of tools for developers, based on our new REST APIs and deck.gl, a modern WebGL/WebGPU-based JavaScript library to visualize data.

This migration can bring a positive impact: you'll have access to more modern tools, built on top of your data warehouse (potentially removing components and ETL processes), and designed for security and scalability.

When facing an application migration project, these are the steps we recommend taking:

  1. Get familiar with CARTO + deck.gl: Explore our documentation and examples to understand how applications are built in the new platform.

  2. Get familiar with the CARTO APIs: Explore our new APIs to unlock additional functionality such as imports or geocoding.

  3. Get familiar with authentication mechanisms in the new CARTO: From public applications to large-scale integrations using SSO, the new CARTO offers a flexible set of authentication strategies for developers.

  4. Make a list of the functionalities and technologies in your current application that use CARTO: in order to design and scope the migration effort, review the features that are currently using CARTO, along with the technologies used.

  5. Choose a replacement for each of the functionalities/technologies: while this should be done on a case-by-case basis, this table provides general guidelines you can follow:

Functionality | Legacy Technology | Replace with…
--- | --- | ---
Database | CARTODB | Your own cloud data warehouse (BigQuery, Snowflake, Redshift, Databricks or PostgreSQL), connected using CARTO
Javascript-based rendering of maps, layers, symbols, popups, tooltips, etc. | CartoJS, CartoVL | CARTO + deck.gl
Styling data-driven visualizations | CartoCSS | CARTO + deck.gl
SQL API for querying data | CARTO SQL API | CARTO SQL API
Python notebooks | CARTOframes | CARTO + Python
Use maps from CARTO Builder into a custom application | Named maps, viz.json | fetchMap in CARTO + deck.gl
Build applications in CARTO without exposing SQL client-side | Named maps | Named Sources
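
For example, applications that embedded Builder maps through Named maps or viz.json can now load the same map definition with fetchMap. Below is a minimal sketch using deck.gl's CARTO module; the map ID is a placeholder you would copy from the Builder sharing options:

import {Deck} from '@deck.gl/core';
import {fetchMap} from '@deck.gl/carto';

// Fetch the layers and view state of a map created in CARTO Builder
// ('your-map-id' is a placeholder for the ID of a shared Builder map)
const {initialViewState, layers} = await fetchMap({cartoMapId: 'your-map-id'});

new Deck({initialViewState, controller: true, layers});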

  6. Migrate your codebase: follow the plan and take the opportunity to achieve even better performance, add new features, and load even more types of data in your application, such as raster or spatial indexes.
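
To give a sense of the target stack, here is a minimal, self-contained sketch of a CARTO + deck.gl application rendering a migrated table; the access token, connection name, and table name are placeholders:

import {Deck} from '@deck.gl/core';
import {VectorTileLayer, vectorTableSource} from '@deck.gl/carto';

// Placeholder credentials: use your own token, connection, and table
const data = vectorTableSource({
  accessToken: 'your-API-access-token',
  connectionName: 'your-connection-name',
  tableName: 'project.dataset.table'
});

new Deck({
  initialViewState: {longitude: 0, latitude: 0, zoom: 2},
  controller: true,
  layers: [
    new VectorTileLayer({
      id: 'migrated-layer',
      data,
      pointRadiusMinPixels: 3,
      getFillColor: [200, 0, 80]
    })
  ]
});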

Each migration project will be a unique process. If you have questions or need guidance, our support team (support@carto.com) will be happy to help you.

Conclusions

The new CARTO platform is an important leap into the future of cloud-native geospatial analysis. Content from previous legacy versions needs to be migrated manually due to the remarkable differences in the technical approach of each version. During the migration, you will be in charge of defining how your new maps, workflows, and applications will look in the new platform.

If you’ve already migrated your data, maps, and applications

Then… Congrats! You’re fully migrated and ready to experience the new CARTO.

FAQs

Can I start using the new CARTO without migrating?

Totally! Migrating is optional; it's only needed if you want to preserve your old datasets, maps, and applications by bringing them into the new platform.

Moreover, our recommendation is to start using the new CARTO for all upcoming projects, even if the migration isn't fully finished.

Why can’t CARTO do this migration automatically?

The new CARTO and the legacy versions have different technical approaches, user interfaces, and feature sets. That is why, during the migration process, you will need to make a considerable number of decisions to define how your data, maps, and applications will exist in the new CARTO. We can't make those decisions for you, and therefore we can't migrate them automatically.
