CARTO Workflows is currently in public beta. What are the plans to release the first version as “general availability”?

We plan to release the first official version of CARTO Workflows to general availability by the end of March 2023.

Is CARTO Workflows available in self-hosted deployments?

CARTO Workflows will be available for self-hosted deployment as soon as we launch the tool to general availability, which is planned for March 2023. Please contact [email protected] if you would like more information about this topic.

Do I have to pay more to be able to use CARTO Workflows?

No, CARTO Workflows is a core tool of our platform and is available in all subscription plans. Running workflows, in the same way as building and using maps, consumes computing resources in the CARTO platform and in the associated cloud data warehouse, which may be subject to specific usage-based quotas.

When creating a new workflow, I cannot see the data sources available in my connection. What may be happening?

This may be because the data warehouse connection associated with the workflow does not have the permissions required to run Workflows in the data warehouse, such as permission to create schemas in the Workflows temporary location (configured in the advanced options of the connection card). Please choose or create another connection with data owner permissions, or modify the permissions of the current connection, and try again. If the issue persists, please contact our support team at [email protected]

When working with data sources from a BigQuery connection, I receive an error message saying that I don't have permission to query the table, or that the table does not exist in a specific region. What may be happening?

In order to function, CARTO Workflows creates a temporary dataset in BigQuery, named workflows_temp, where it stores the temporary objects needed to fully execute a workflow. In BigQuery, we create this dataset in the default region of the GCP project associated with the BigQuery connection. If you want to include in a workflow data sources that are stored in a region different from your GCP project's default one, you need to create a new workflows_temp dataset in that other region and specify its location in the Advanced options of your BigQuery connection.
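The additional temporary dataset can be created with a standard BigQuery DDL statement. A minimal sketch, assuming a project named `my-project`, the EU multi-region, and a placeholder dataset name `workflows_temp_eu` (substitute your own project ID, region, and name, since dataset IDs must be unique within a project):

```sql
-- Create a temporary dataset for CARTO Workflows in a non-default region.
-- `my-project`, `workflows_temp_eu`, and 'EU' are placeholders;
-- replace them with your own project ID, dataset name, and region.
CREATE SCHEMA IF NOT EXISTS `my-project.workflows_temp_eu`
OPTIONS (
  location = 'EU',
  description = 'Temporary objects for CARTO Workflows (EU region)'
);
```

Once created, point the Workflows temp location of your BigQuery connection to this dataset in the connection's Advanced options.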

When working with data sources from a Snowflake connection, I receive the following error message: "cannot get Workflow schema". What may be happening?

In order to guarantee the successful execution of a workflow via a Snowflake connection, please make sure that in the settings of the connection you have specified the database of your Snowflake account that you want to work with via that specific connection. This field is now required by CARTO Workflows. Note that you can edit an existing connection at any time.

My CARTO organization runs on a cloud region outside of the US. Can I use data from my Data Observatory subscriptions via the CARTO Data Warehouse connection in a workflow?

In order to function, CARTO Workflows requires a specific dataset where it stores the temporary objects necessary to run a workflow. When you use the CARTO Data Warehouse connection, this dataset, named workflows_temp, is created in the default region where your account was provisioned, which can be in EU, APAC, etc. On the other hand, by default, the data from your Data Observatory subscriptions is shared with you in a dataset named carto-data, which is centralized in our US multi-region. Therefore, it is currently not possible to use your Data Observatory subscriptions in workflows if your CARTO Data Warehouse sits in a region outside the US. For now, if your account is outside of the US and you would like to leverage your data subscriptions in CARTO Workflows, please contact our Technical Support team, and we will manually provision the data from your Data Observatory subscription in your region.