Input / Output

Components that allow you to import or export data to/from CARTO and your connected cloud data warehouse.

Export to Bucket

Description

This component exports a table to a storage bucket.

For Self-Hosted deployments, a bucket owned by the customer needs to be configured. Please refer to this documentation for more information.

Inputs

  • Source table

Settings

  • File format: Select between CSV and JSON formats for the exported file(s).

  • Export compressed: Choose between a compressed (.gz) or uncompressed file.

  • GCS Bucket name: The name of a user-provided bucket where the exported file(s) will be stored. Optionally, a path can be appended to the bucket name, separated by slashes (/). The bucket name must follow the GCS specification, and folder names, in addition to GCS requirements, must not contain single quotes (') or backslashes (\). The connection (the email address that appears on the Connections page) needs at least permission to create and delete objects. To download the resulting URLs in a browser, users must grant permissions to the email address of an account they can log in with. This custom bucket won't work with CARTO Data Warehouse connections unless the bucket is public.

Using this component requires that the connection has BigQuery User permission in the billing project.
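The bucket-and-path rules above can be sketched as a quick client-side check. This is a hypothetical helper, not part of the CARTO platform; the authoritative rules are the GCS bucket naming specification plus CARTO's extra constraint on folder names:

```python
import re

def validate_bucket_path(bucket_and_path: str) -> bool:
    """Rough client-side check for a GCS bucket name plus optional folder path.

    Hypothetical helper: GCS bucket names are 3-63 characters of lowercase
    letters, digits, dashes, dots, and underscores, starting and ending with
    a letter or digit. CARTO additionally requires that folder names contain
    no single quotes (') or backslashes (\\).
    """
    bucket, _, path = bucket_and_path.partition("/")
    if not re.fullmatch(r"[a-z0-9][a-z0-9._-]{1,61}[a-z0-9]", bucket):
        return False
    if "'" in path or "\\" in path:
        return False
    return True

print(validate_bucket_path("my-export-bucket/exports/2024"))  # valid
print(validate_bucket_path("my-bucket/bad'folder"))           # invalid
```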

Outputs

  • Result table: A table containing links to the result of the export.

Get Table by Name

Description

Load a table into the workflow by giving its name as a string, e.g. db.schema.table.

Inputs

  • Source table reference: A string that contains the FQN of the table that will be loaded to the canvas.

Output

  • Result table
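The db.schema.table format can be illustrated with a minimal parser. This is a sketch only: real data warehouses also allow quoted identifiers that may themselves contain dots, which a simple split does not handle:

```python
def parse_fqn(fqn: str) -> tuple[str, str, str]:
    """Split a fully qualified table name of the form db.schema.table.

    Illustrative sketch: does not handle quoted identifiers containing dots.
    """
    parts = fqn.split(".")
    if len(parts) != 3:
        raise ValueError(f"Expected db.schema.table, got: {fqn!r}")
    database, schema, table = parts
    return database, schema, table

print(parse_fqn("my-project.analytics.customers"))
```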

Import from URL

Description

This component imports a table from a URL.

This component needs to be executed (by running the workflow) before it can be connected to other components. This process may take several minutes.

Inputs

  • Source URL: The public URL to be used to get the file that will be imported.

    • Check the "Automatically guess column data types in the imported table" checkbox to automatically guess data types on the input file. If disabled, all properties will be imported as a String.

Outputs

  • Output table: The component generates a table that contains the result of importing the file from "Source URL".
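The "Automatically guess column data types" behavior can be illustrated with a small sketch. This is an assumption about how such guessing typically works (try integer, then float, then fall back to string), not CARTO's actual implementation:

```python
import csv
import io

def guess_type(value: str) -> str:
    """Illustrative type guesser: integer, then float, then string."""
    try:
        int(value)
        return "INTEGER"
    except ValueError:
        pass
    try:
        float(value)
        return "FLOAT"
    except ValueError:
        return "STRING"

# Sample CSV content, as if fetched from the "Source URL".
sample = io.StringIO("id,price,name\n1,9.99,widget\n2,4.50,gadget\n")
reader = csv.DictReader(sample)
rows = list(reader)
# Guess each column's type from the first data row.
types = {col: guess_type(rows[0][col]) for col in reader.fieldnames}
print(types)  # {'id': 'INTEGER', 'price': 'FLOAT', 'name': 'STRING'}
```

With the checkbox disabled, every column would simply map to STRING.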

HTTP Request

Description

This component performs requests to external endpoints.

Inputs

  • Source table (Optional): If the component has an input table, columns from this input can be used within expressions in other settings.

Take into account that this component doesn't accept more than 10 rows as input. This limit is in place to ensure stability.

Settings

  • URL: The URL that will be used to perform the HTTP request.

  • Options (Optional): JSON object with options for the HTTP Request. You can use this to set a different method than GET, specify headers or a body for your request. Follows the fetch() API options specification.

    • Example: { "method":"POST", "headers":{ "Content-Type":"application/json" }, "mode":"cors", "cache":"default", "body": "{ ...}" }

Both URL and Options settings allow the usage of expressions and variables:

  • This is an example of using values from an input's column in the URL, concatenating strings with values from columns, all within an expression enclosed with double curly braces: {{'https://your-api-domain.com/coordinates_endpoint?lat=' || latitude || '&lon=' || longitude || '&otherparams=...'}}

  • Using expressions in the Options setting is a bit different, since the content of it is stringified on the data warehouse, and we can use column names between braces directly instead of concatenating strings, for example: { "method":"POST", "headers":{ "Content-Type":"application/json" }, "mode":"cors", "cache":"default", "body": "{\"lng\": {{longitude_column}}, \"lat\":{{latitude_column}} }" }

In the examples above, values from latitude and longitude columns have been used with expressions to generate a different URL/options for each row in the input table.

  • Allowed hosts (Optional): When expressions or variables are used in the URL, you need to set a comma-separated list of allowed hosts for your requests. If expressions/variables are not used in URL, this setting is ignored.

Warning: API Key Security. Including an API key or an authorization header in the URL or options object may expose them in data warehouse logs. Handle keys responsibly to prevent unauthorized access.
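The per-row expression evaluation described above can be emulated with a small sketch. CARTO actually evaluates {{...}} expressions in the data warehouse; this hypothetical helper only covers the simple case of SQL '||' string concatenation between literals and column references:

```python
import re

def render_expression(template: str, row: dict) -> str:
    """Emulate per-row templating of {{...}} expressions.

    Illustrative only: handles string literals and bare column names
    joined with the SQL '||' concatenation operator.
    """
    def evaluate(match: re.Match) -> str:
        parts = [p.strip() for p in match.group(1).split("||")]
        out = []
        for part in parts:
            if part.startswith("'") and part.endswith("'"):
                out.append(part[1:-1])      # string literal
            else:
                out.append(str(row[part]))  # column reference
        return "".join(out)
    return re.sub(r"\{\{(.*?)\}\}", evaluate, template)

row = {"latitude": 40.42, "longitude": -3.70}
url = render_expression(
    "{{'https://your-api-domain.com/coordinates_endpoint?lat=' || latitude || '&lon=' || longitude}}",
    row,
)
print(url)
```

Each input row would produce its own rendered URL in this way.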

Outputs

  • Output table: This component produces a table that contains all columns from the input table (if any was connected) plus a column response_data containing a string with the response's data.
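Since response_data holds the raw response body as a string, JSON responses need to be parsed downstream. A minimal sketch, with a hypothetical response value:

```python
import json

# Hypothetical value of the response_data column for one row.
response_data = '{"elevation": 667, "units": "m"}'
parsed = json.loads(response_data)
print(parsed["elevation"])  # 667
```

In SQL, the equivalent would typically be the data warehouse's JSON extraction functions applied to the response_data column.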

External links

This component makes use of the functions in the http_request module of the CARTO Analytics Toolbox. Check the module's reference for more specific documentation:

Output

Description

This component sets the node connected to it as output for API executions of the workflow.

Inputs

  • Source table

This component can only be used once in a workflow. The content of the node connected to it will be stored in a temporary table, specified in the API response when calling the execution of a workflow as: "workflowOutputTableName": "workflows-api-demo.workflows_temp.wfproc_f2f8df5df4ddf279_out_33afd785675f081d".
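A caller consuming the API response might extract the temporary table's name like this. The response excerpt is hypothetical and abridged; the real response contains additional fields:

```python
import json

# Hypothetical, abridged API response from a workflow execution.
api_response = json.loads("""
{
  "workflowOutputTableName": "workflows-api-demo.workflows_temp.wfproc_f2f8df5df4ddf279_out_33afd785675f081d"
}
""")
output_table = api_response["workflowOutputTableName"]
project, dataset, table = output_table.split(".")
print(project)  # workflows-api-demo
print(dataset)  # workflows_temp
```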

Find more information about how to define the output of a workflow here.

Save as Table

Description

This component writes an input table to a non-temporary location.

If a table exists under that location, it will be overwritten.

Inputs

  • Source table

Settings

  • FQN of table to create/overwrite: Use the UI to select a destination for the target table.

  • Append [true/false]: Determines whether the result will be appended to an existing table or the table will be overwritten.

Provided FQNs need to follow the naming conventions of each data warehouse. Invalid names (according to each data warehouse's constraints) will make the component fail. Check the documentation for each data warehouse.

Outputs

  • This component doesn't have an output in Workflows, but instead generates a table in the specified destination.

Send by Email

This component requires the CARTO Analytics Toolbox installed in the chosen connection to build the workflow.

This component is not available in Self-Hosted installations of the CARTO platform.

Description

This component sends an email with a custom subject and body. Optionally, it can export data to a bucket and add the resulting link(s) to the email's body.

Inputs

  • Source table [Table]

Settings

  • Email address: The email address that will receive the email. This field supports multiple addresses as separate recipients for the email.

  • Subject: The subject of the email.

  • Body: The content of the email.

  • Include data: If enabled, data from the input table will be exported to a bucket and resulting link(s) will be added to the email's body.

  • File format:

    • CSV

    • JSON

  • GCS Bucket name (Optional): If provided, the data will be exported to a custom GCS bucket. Optionally, a path can be appended to the bucket name, separated by slashes (/). The bucket name must follow the GCS specification, and folder names, in addition to GCS requirements, must not contain single quotes (') or backslashes (\). The connection (the email address that appears on the Connections page) needs at least permission to create and delete objects. To download the resulting URLs in a browser, users must grant permissions to the email address of an account they can log in with. This custom bucket won't work with CARTO Data Warehouse connections unless the bucket is public.

Outputs

  • This component doesn't have an output that can be connected to another node.
