Input / Output


Components that allow you to import or export data to/from CARTO and your connected cloud data warehouse.

Export to Bucket

Description

This component exports a table to a storage bucket.

For Self-Hosted deployments, a bucket owned by the customer needs to be configured. Please refer to this documentation for more information.

Inputs

  • Source table

Settings

  • File format: Select between CSV and JSON formats for the exported file(s).

  • Export compressed: Choose between a compressed file (.gz) or uncompressed.

  • GCS Bucket name: The export will be written to a bucket provided by the user, identified by its name. Optionally, a path can be added to the bucket name, separated by slashes (/). The name of the bucket should follow the GCS specification, and the name of the folder(s), in addition to the GCS requirements, should not contain single quotes (') or backslashes (\). The connection (the email that appears on the Connections page) should at least have permission to create and delete objects. To download the resulting URLs in a browser, you should grant read permissions to the email of an account you can log in with. This custom bucket won't work with CARTO Data Warehouse connections unless the bucket is public.
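
For example, a valid value combining a bucket name and a path (both names hypothetical) would be: my-company-bucket/carto-exports/monthly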

Using this component requires that the connection has the BigQuery User permission in the billing project.

Outputs

  • Result table: A table containing links to the result of the export.

Create Builder Map

Description

This component creates a Builder map on its first execution. On subsequent executions, it can update, overwrite or create a copy of the existing map.

Inputs

This component can receive any number of source table inputs. Each connected node will result in a data source (with its corresponding layer) in the created Builder map. These data sources will be tagged as "Temporal" in Builder, since they might be subject to an expiration date or live in a temporary dataset/schema (more info).

If you need to create permanent data sources in your map, use the "Save as Table" component and connect its output to the "Create Builder Map" component. The result will be a standard, persistent data source in your Builder map.

Settings

  • Map name: A string that defines the name of the created Builder map. Usage of expressions and variables is allowed to generate a name by concatenating strings and variables. For example: {{'Map of the state of ' || @state_name}}

  • Re-running mode: This setting controls the behaviour of the component after the first execution. It has three different modes:

    • Update (default): In this mode, executing the workflow updates data sources in the map, syncing them with the currently connected input nodes while keeping map styles, widgets and other map settings.

    • Overwrite: In this mode, executing the workflow deletes the current map and creates a new one from scratch with default settings. The new map will have a different ID and URL.

    • Create a copy: In this mode, executing the workflow creates a new copy of the map with the same styles and other settings. The copy will reflect the currently connected input nodes.

Output

  • This component generates a single-row table with three columns that contain the map ID, the map URL and a timestamp of the last execution.

Get Table by Name

Description

Loads a table into the workflow, given its name as a string, e.g. db.schema.table.

Inputs

  • Source table reference: A string that contains the FQN of the table that will be loaded to the canvas.

Output

  • Result table

Import from URL

Description

This component imports a table from a URL.

This component needs to be executed (by running the workflow) before its output can be connected to other components. This process may take several minutes.

Inputs

  • Source URL: The public URL to be used to get the file that will be imported.

    • Check the "Automatically guess column data types in the imported table" checkbox to automatically guess data types on the input file. If disabled, all properties will be imported as a String.

Outputs

  • Output table: The component generates a table that contains the result of importing the file from "Source URL".

HTTP Request

Description

This component performs requests to external endpoints.

Inputs

  • Source table (Optional): If the component has an input table, columns from this input can be used within expressions in other settings.

Take into account that this component doesn't accept more than 10 rows as input. This limit is in place to ensure stability.

Settings

  • URL: The URL that will be used to perform the HTTP request.

  • Options (Optional): JSON object with options for the HTTP request. You can use this to set a method other than GET, or to specify headers or a body for your request. It follows the fetch() API options specification.

    • Example: { "method":"POST", "headers":{ "Content-Type":"application/json" }, "mode":"cors", "cache":"default", "body": "{ ...}" }

Both URL and Options settings allow the usage of expressions and variables:

  • This is an example of using values from an input's column in the URL, concatenating strings with values from columns, all within an expression enclosed with double curly braces: {{'https://your-api-domain.com/coordinates_endpoint?lat=' || latitude || '&lon=' || longitude || '&otherparams=...'}}

  • Using expressions in the Options setting is a bit different: since its content is stringified in the data warehouse, column names can be used between braces directly instead of concatenating strings, for example: { "method":"POST", "headers":{ "Content-Type":"application/json" }, "mode":"cors", "cache":"default", "body": "{\"lng\": {{longitude_column}}, \"lat\":{{latitude_column}} }" }

In the examples above, values from the latitude and longitude columns have been used within expressions to generate a different URL/options pair for each row in the input table.

  • Allowed hosts (Optional): When expressions or variables are used in the URL, you need to set a comma-separated list of allowed hosts for your requests. If expressions/variables are not used in the URL, this setting is ignored.
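
Putting these settings together, a request to the hypothetical geocoding endpoint from the examples above could be configured as follows (domain, parameters and column names are illustrative, not a real API):

URL: {{'https://your-api-domain.com/coordinates_endpoint?lat=' || latitude || '&lon=' || longitude}}

Options: { "method":"GET", "headers":{ "Accept":"application/json" } }

Allowed hosts: your-api-domain.com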

Warning (API key security): Including an API key or an authorization header in the URL or Options object may expose them in data warehouse logs. Handle keys responsibly to prevent unauthorized access.

Outputs

  • Output table: This component produces a table that contains all columns from the input table (if any was connected), plus a response_data column containing the response's data as a string.
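
If you need to work with the response downstream, the response_data string can be parsed in a later SQL step. This is a minimal sketch assuming a BigQuery connection and a JSON response with a hypothetical status field; adapt the JSON function and path to your data warehouse and payload:

SELECT
  *,
  JSON_VALUE(response_data, '$.status') AS response_status -- extracts a scalar from the JSON string
FROM input_table -- placeholder for the table produced by this component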

External links

This component makes use of the functions in the http_request module of the CARTO Analytics Toolbox. Check the links below for more specific documentation:

  • BigQuery

  • Snowflake

  • Redshift

Output

Description

This component sets the node connected to it as the output for API executions of the workflow.

Inputs

  • Source table

This component can only be used once in a workflow. The content of the node connected to it will be stored in a temporary table, specified in the API response when calling the execution of a workflow as: "workflowOutputTableName": "workflows-api-demo.workflows_temp.wfproc_f2f8df5df4ddf279_out_33afd785675f081d".

Find more information about how to define the output of a workflow here.
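
For illustration, the temporary table reported in workflowOutputTableName can be queried directly in the data warehouse. This sketch reuses the example table name above; the name generated for your workflow will differ:

SELECT *
FROM `workflows-api-demo.workflows_temp.wfproc_f2f8df5df4ddf279_out_33afd785675f081d`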

Save as Table

Description

This component writes an input table to a non-temporary location.

If a table exists under that location, it will be overwritten.

Inputs

  • Source table

Settings

  • FQN of table to create/overwrite. There are two different options:

    • Use the UI to select a destination for the target table in the current connection.

    • Type a FQN or an expression that uses variables or concatenates strings and variables, like: {{@variable_fqn}} or {{'database.schema.result_table_' || @variable_name}}

  • Append [true/false]: Determines whether the result will be appended to an existing table or will overwrite it.

Provided FQNs need to follow the naming convention of each data warehouse. Invalid names (according to each data warehouse's constraints) will make the component fail. Check the documentation for each: BigQuery, Snowflake, Databricks, Redshift, PostgreSQL.
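
For reference, a fully qualified name typically follows these patterns (project, database, schema and table names below are placeholders):

BigQuery: my-project.my_dataset.my_table

Snowflake: MY_DATABASE.MY_SCHEMA.MY_TABLE

PostgreSQL: my_database.my_schema.my_table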

Outputs

  • The output of this component references the table created in the defined location.

Send by Email

This component is not available in Self-Hosted installations of the CARTO platform.

Description

This component sends an email with a custom subject and body. Optionally, it can export data to a bucket and add the resulting link(s) to the email's body.

This component requires the CARTO Analytics Toolbox installed in the chosen connection to build the workflow.

Inputs

  • Source table [Table]

Settings

  • Email address: The email address(es) that will receive the email. This field supports multiple addresses as separate recipients of the email.

  • Subject: The subject of the email.

  • Body: The content of the email.

  • Include data: If enabled, data from the input table will be exported to a bucket and the resulting link(s) will be added to the email's body.

  • File format:

    • CSV

    • JSON

  • GCS Bucket name (Optional): If provided, the data will be exported to a custom GCS bucket. Optionally, a path can be added to the bucket name, separated by slashes (/). The name of the bucket should follow the GCS specification, and the name of the folder(s), in addition to the GCS requirements, should not contain single quotes (') or backslashes (\). The connection (the email that appears on the Connections page) should at least have permission to create and delete objects. To download the resulting URLs in a browser, you should grant read permissions to the email of an account you can log in with. This custom bucket won't work with CARTO Data Warehouse connections unless the bucket is public.

Outputs

  • This component doesn't have an output that can be connected to another node.
