How to schedule a workflow
Once you have designed a workflow, you will often want to schedule its execution periodically. This keeps the results of the workflow automatically up to date when its input data sources have been modified between one execution and the next.
While we work on adding the option to schedule workflows directly from the Workflows UI, you can schedule a workflow on your data warehouse by following these simple steps:
1. Go to the "SQL" tab of the results panel of your workflow.
2. Click on the "Copy" button at the top right of the SQL tab to copy the SQL code corresponding to your entire workflow.
3. The next steps depend on the data warehouse platform where your workflow is executed. Our recommendations for scheduling the execution of the SQL code derived from the workflow on each platform are the following:
BigQuery lets you schedule the execution of SQL queries using a built-in scheduler in the console. A step-by-step guide for creating a scheduled query is available here. To learn more, check the documentation about BigQuery scheduled queries.
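Besides the console, scheduled queries can also be created from the `bq` command-line tool. A minimal sketch, assuming the SQL copied from the workflow has been saved to a local file named `workflow.sql` (the display name, schedule, and file name are placeholders to adapt to your project):

```shell
# Hypothetical sketch: display name, schedule, and workflow.sql are placeholders.
bq query \
  --use_legacy_sql=false \
  --display_name="Refresh my workflow" \
  --schedule="every 24 hours" \
  "$(cat workflow.sql)"
```

Creating a scheduled query this way requires the BigQuery Data Transfer Service to be enabled in your project.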
Snowflake uses Tasks to allow the scheduled execution of SQL code. Tasks can be created via SQL statements using the CREATE TASK syntax. Detailed examples of how to create Tasks are available in the Snowflake documentation.
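As a sketch, a task that runs the copied workflow SQL every day at 06:00 UTC could look like the following (the task name, warehouse name, and schedule are placeholders to adapt to your account):

```sql
-- Hypothetical example: task name, warehouse, and schedule are placeholders.
CREATE TASK refresh_my_workflow
  WAREHOUSE = MY_WAREHOUSE
  SCHEDULE = 'USING CRON 0 6 * * * UTC'
AS
  -- Paste here the SQL code copied from the "SQL" tab of the workflow
  SELECT 1;

-- Tasks are created in a suspended state; resume the task to start the schedule
ALTER TASK refresh_my_workflow RESUME;
```

Note that the role creating the task needs the EXECUTE TASK privilege for the task to run.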
Redshift allows you to create a schedule to run SQL statements using the query editor. More details about how to use this feature are available here.
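Outside the query editor, the same schedule can be set up with Amazon EventBridge invoking the Redshift Data API. A minimal sketch with the AWS CLI, where the rule name, ARNs, database, and secret are all placeholders to adapt to your account:

```shell
# Hypothetical sketch: rule name, ARNs, database, and secret are placeholders.
aws events put-rule \
  --name refresh-my-workflow \
  --schedule-expression "rate(1 day)"

aws events put-targets \
  --rule refresh-my-workflow \
  --targets '[{
    "Id": "workflow-sql",
    "Arn": "arn:aws:redshift:us-east-1:123456789012:cluster:my-cluster",
    "RoleArn": "arn:aws:iam::123456789012:role/my-eventbridge-role",
    "RedshiftDataParameters": {
      "Database": "dev",
      "SecretManagerArn": "arn:aws:secretsmanager:us-east-1:123456789012:secret:my-secret",
      "Sql": "-- paste here the SQL code copied from the workflow"
    }
  }]'
```

The IAM role referenced in `RoleArn` must allow EventBridge to call the Redshift Data API on your behalf.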
For PostgreSQL, the pg_cron extension provides the means to schedule the execution of a SQL query. Here are some resources to learn how to leverage pg_cron on different managed services based on PostgreSQL:
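For instance, assuming pg_cron is installed and enabled on your instance, the copied SQL could be scheduled to run daily at 06:00 with something like the following (the job name and schedule are placeholders):

```sql
-- Hypothetical example: job name and schedule are placeholders.
CREATE EXTENSION IF NOT EXISTS pg_cron;

SELECT cron.schedule(
  'refresh-my-workflow',  -- job name
  '0 6 * * *',            -- standard cron syntax: daily at 06:00
  $$ /* Paste here the SQL code copied from the workflow */ SELECT 1; $$
);

-- To stop the schedule later:
-- SELECT cron.unschedule('refresh-my-workflow');
```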