Input / Output
Components that allow you to import or export data to/from CARTO and your connected cloud data warehouse.
Export to Bucket
Description
This component exports a table to a storage bucket. Unless a GCS Bucket Name is provided, the files are exported to a bucket owned by CARTO. Read access to this bucket is public, but listing is forbidden; this means a file can only be downloaded by someone who has the link, whose name is not guessable.
For Self-Hosted deployments, a bucket owned by the customer needs to be configured. Please refer to this documentation for more information.
Inputs
Source table [Table]
File format: Select between CSV and JSON formats for the exported file(s).
Export compressed: Choose between a compressed file (.gz) or uncompressed.
(Optional) GCS Bucket name: A custom bucket provided by the user can be used by entering its name. Optionally, one or more folders can be appended to the bucket name, separated by slashes (/). The bucket name must follow the GCS specification, and the folder names, in addition to GCS requirements, must not contain single quotes (') or backslashes (\). The connection (the email that appears on the Connections page) must at least have permission to create and delete objects. To download the resulting URLs in a browser, users must grant permission to the email of an account they can log in with. This custom bucket won't work with CARTO Data Warehouse connections unless the bucket is public.
Using this component requires that the connection has BigQuery User permission in the billing project.
Outputs
Result table [Table]
A table containing links to the result of the export.
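When "Export compressed" is enabled, the downloaded files are gzip-compressed CSV or JSON. A minimal sketch of decompressing and parsing such a file client-side (the sample data and column names are hypothetical, used here only to illustrate the format):

```python
import csv
import gzip
import io

# Simulate a compressed CSV export, as produced with "Export compressed"
# enabled (the columns "id" and "name" are hypothetical sample data).
raw = "id,name\n1,store_a\n2,store_b\n"
compressed = gzip.compress(raw.encode("utf-8"))

# Decompress the downloaded .gz file and parse the CSV rows.
with gzip.open(io.BytesIO(compressed), mode="rt", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

print(rows[0]["name"])  # store_a
```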
Get Table by Name
Description
Loads a table into the workflow, given its name as a string, e.g. db.schema.table.
Inputs
Source table reference [String]
Output
Result table [Table]
Import from URL
Description
This component imports a table from a URL.
This component needs to be executed (by running the workflow) before it can be connected to other components. This process may take several minutes.
Inputs
Source URL [String]
: The public URL used to get the file that will be imported.
Check the "Automatically guess column data types in the imported table" checkbox to automatically guess data types from the input file. If disabled, all properties will be imported as String.
Outputs
Output table [Table]
: The component generates a table that contains the result of importing the file from "Source URL".
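The type-guessing behavior described above can be illustrated with a rough sketch: each value is cast to the narrowest numeric type that parses, otherwise kept as a string. This is only an illustration of the idea, not CARTO's actual implementation:

```python
def guess(value: str):
    """Return the value cast to int or float when possible, else the original string."""
    for cast in (int, float):
        try:
            return cast(value)
        except ValueError:
            pass
    return value

# With guessing enabled, "42" becomes an integer and "3.14" a float;
# with guessing disabled, every value would stay a string.
print(guess("42"), guess("3.14"), guess("hello"))
```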
HTTP Request
Description
This component performs requests to external endpoints.
Inputs
Source table [Table] (Optional)
: If the component has an input table, columns from this input can be used within expressions in other settings.
Take into account that this component doesn't accept more than 10 rows as input. This limit is in place to ensure stability.
URL [String]
: The URL that will be used to perform the HTTP request.
Options [String] (Optional)
: JSON object with options for the HTTP request. You can use this to set a method other than GET, or to specify headers or a body for your request. Follows the fetch() API options specification. Example:
{ "method":"POST", "headers":{ "Content-Type":"application/json" }, "mode":"cors", "cache":"default", "body": "{ ...}" }
Both URL and Options settings allow the usage of expressions and variables:
This is an example of using values from an input's column in the URL, concatenating strings with values from columns, all within an expression enclosed with double curly braces:
{{
'https://your-api-domain.com/coordinates_endpoint?lat=' ||
latitude
|| '&lon=' ||
longitude ||
'&otherparams=...'
}}
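For a sample input row, the concatenation expression above yields a single URL per row. A Python sketch of the resulting string (the domain, parameter names, and coordinate values are placeholders taken from the example):

```python
# One input row; latitude/longitude are the columns used in the expression
# above, and the coordinate values are made up for illustration.
row = {"latitude": 40.4168, "longitude": -3.7038}

# Equivalent of the SQL string concatenation for this row.
url = (
    "https://your-api-domain.com/coordinates_endpoint"
    f"?lat={row['latitude']}&lon={row['longitude']}"
)
print(url)
```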
Using expressions in the Options setting is a bit different: since its content is stringified on the data warehouse, column names can be used between braces directly, instead of concatenating strings. For example:
{ "method":"POST", "headers":{ "Content-Type":"application/json" }, "mode":"cors", "cache":"default", "body": "{\"lng\": {{longitude_column}}, \"lat\": {{latitude_column}}}" }
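Conceptually, the {{column}} placeholders in the Options string are replaced with the row's values before the request is made. A rough per-row substitution sketch (this mimics the idea, not the warehouse's actual templating engine; column names and values are hypothetical):

```python
import json

# The Options template from the example above; \\" produces the \" escapes
# needed inside the JSON "body" string.
options_template = (
    '{ "method": "POST", "headers": { "Content-Type": "application/json" }, '
    '"body": "{\\"lng\\": {{longitude_column}}, \\"lat\\": {{latitude_column}}}" }'
)

# One input row with made-up coordinates.
row = {"longitude_column": -3.7038, "latitude_column": 40.4168}

# Substitute each {{column}} placeholder with the row's value.
rendered = options_template
for column, value in row.items():
    rendered = rendered.replace("{{" + column + "}}", str(value))

options = json.loads(rendered)          # the fetch()-style options object
body = json.loads(options["body"])      # the request body it carries
print(body["lat"])  # 40.4168
```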
In the examples above, values from the latitude and longitude columns have been used within expressions to generate a different URL/options for each row in the input table.
Allowed hosts [String] (Optional)
: When expressions or variables are used in the URL, you need to set a comma-separated list of allowed hosts for your requests. If expressions/variables are not used in the URL, this setting is ignored.
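The allow-list check can be pictured as comparing each request URL's host against the configured list. A simple illustration (not CARTO's enforcement code; the host names are hypothetical):

```python
from urllib.parse import urlparse

# Hypothetical value of the "Allowed hosts" setting, already split on commas.
allowed_hosts = {"your-api-domain.com", "api.example.com"}

def is_allowed(url: str) -> bool:
    """Return True if the URL's host is on the allow list."""
    return urlparse(url).hostname in allowed_hosts

print(is_allowed("https://your-api-domain.com/coordinates_endpoint?lat=1"))  # True
print(is_allowed("https://evil.example.net/steal"))  # False
```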
Warning (API key security): Including an API key or an authorization header in the URL or the options object may expose them in data warehouse logs. Handle keys responsibly to prevent unauthorized access.
Outputs
Output table [Table]
: This component produces a table that contains all columns from the input table (if any was connected) plus a response_data column containing a string with the response's data.
External links
This component makes use of the functions in the http_request module of the CARTO Analytics Toolbox. Check its documentation for more specific details.
Output
Description
This component sets the node connected to it as output for API executions of the workflow.
Inputs
Source table [Table]
This component can only be used once in a workflow. The content of the node connected to it will be stored in a temporary table, specified in the API response when calling the execution of a workflow as: "workflowOutputTableName": "workflows-api-demo.workflows_temp.wfproc_f2f8df5df4ddf279_out_33afd785675f081d".
Find more information about how to define the output of a workflow here.
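Reading the temporary output table from the execution response boils down to parsing that JSON field and splitting the fully qualified name. A minimal sketch, using the example response quoted above:

```python
import json

# Example execution-response payload; the table name is the one quoted above.
response_body = json.dumps({
    "workflowOutputTableName":
        "workflows-api-demo.workflows_temp.wfproc_f2f8df5df4ddf279_out_33afd785675f081d"
})

# Parse the response and split the FQN into project, dataset, and table.
payload = json.loads(response_body)
project, dataset, table = payload["workflowOutputTableName"].split(".")
print(dataset)  # workflows_temp
```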
Save as Table
Description
This component writes an input table to a non-temporary location.
If a table exists under that location, it will be overwritten.
Inputs
Source table [Table]
FQN of table to create/overwrite [String]
Append [true/false]:
Determines whether the result will be appended to the existing table.
Outputs
Result table [Table]
Send by Email
This component requires the CARTO Analytics Toolbox installed in the chosen connection to build the workflow.
This component is not available in Self Hosted installations of the CARTO platform
Description
This component saves a table to a bucket and sends a notification with the saved table location.
It is limited to tables with a maximum size of 1GB. For larger tables, an exception will be thrown.
Inputs
Source table [Table]
Email address [String]
Subject [String]
Body [String]
File format [Selection]
Outputs
Email sent