This notebook guides you through connecting to both your CARTO and Snowflake accounts and leveraging CARTO's Analytics Toolbox and CARTO's integration with Pydeck to perform spatial analytics at scale and create map visualizations from Python notebooks. You can find the original notebook .
The outline of this notebook is as follows:
Authentication to CARTO: required to be able to use the CartoLayer in Pydeck
Authentication to Snowflake (credentials with access to the database connected to CARTO, with the Analytics Toolbox installed)
Operations and analysis using the Snowpark Python connector and CARTO's Analytics Toolbox
Map visualizations with CARTO and Pydeck
NOTE: snowflake-snowpark-python is only compatible with Python >= 3.8, so be sure to run the notebook in an appropriate environment.
In order to authenticate to your CARTO account, install the carto_oauth package and use it to log in with your credentials.
The cell below creates a .env file with the environment variables used for connecting to Snowflake.
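A minimal sketch of that cell, using only the standard library; the variable names and values are placeholder assumptions, not taken from the original notebook, and the manual parsing stands in for what python-dotenv's load_dotenv would normally do:

```python
from pathlib import Path

# Write a .env file with placeholder Snowflake credentials
# (assumed variable names, not the notebook's exact ones).
env_content = """\
SNOWFLAKE_ACCOUNT=my_account
SNOWFLAKE_USER=my_user
SNOWFLAKE_PASSWORD=my_password
SNOWFLAKE_DATABASE=my_database
SNOWFLAKE_WAREHOUSE=my_warehouse
"""
Path(".env").write_text(env_content)

# Parse the file back into a dict; python-dotenv would normally
# read this file and export the values into the environment.
env = dict(
    line.split("=", 1)
    for line in Path(".env").read_text().splitlines()
    if line and not line.startswith("#")
)
print(env["SNOWFLAKE_ACCOUNT"])  # my_account
```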
We load our Snowflake credentials from the environment with os and use them to create a Snowpark session through the Python connector.
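A sketch of this step, under the assumption that the environment variables follow the names above; the session creation itself needs the snowflake-snowpark-python package and live credentials, so it is shown commented out:

```python
import os

# Assumed environment variable names; a sketch, not the notebook's exact code.
connection_params = {
    "account": os.environ.get("SNOWFLAKE_ACCOUNT", "my_account"),
    "user": os.environ.get("SNOWFLAKE_USER", "my_user"),
    "password": os.environ.get("SNOWFLAKE_PASSWORD", "my_password"),
    "database": os.environ.get("SNOWFLAKE_DATABASE", "my_database"),
    "warehouse": os.environ.get("SNOWFLAKE_WAREHOUSE", "my_warehouse"),
}

# Requires snowflake-snowpark-python and valid credentials:
# from snowflake.snowpark import Session
# session = Session.builder.configs(connection_params).create()
```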
“Crossfit” is a gym chain located in California. We will run a location analysis of “Crossfit” venues vs. their competitors.
We can export the output of a query directly as a pandas DataFrame. The geometry column is downloaded as GeoJSON text.
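Because the geometry arrives as GeoJSON text rather than a parsed geometry object, it can be decoded with the standard json module (or a library such as shapely) before use. A minimal sketch with a made-up point:

```python
import json

# A hypothetical geometry value as it might appear in the DataFrame column.
geojson_text = '{"type": "Point", "coordinates": [-122.4194, 37.7749]}'

geom = json.loads(geojson_text)
lon, lat = geom["coordinates"]
print(geom["type"], lon, lat)  # Point -122.4194 37.7749
```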
We transform our current DataFrame and upload it back into our Snowflake database.
Here we visualize the uploaded data in two layers, using the new styling functions and the Analytics Toolbox installed in Snowflake:
hexagons: renders the H3 cells with a continuous color style representing the dominance ratio of Crossfit gyms vs. the total number of gyms
points: plots the locations of the gyms, with a color category style representing the gym type (Crossfit gyms vs. competitor gyms)
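The two-layer setup can be sketched as plain dicts that mirror the arguments one would pass to pydeck's Layer objects; every name and styling value here is an illustrative assumption, not the notebook's exact code:

```python
# Sketch of the two layers as plain dicts mirroring pydeck.Layer arguments.
hexagon_layer = {
    "type": "CartoLayer",
    "id": "hexagons",
    "geo_column": "h3",  # render rows as H3 cells (assumed column name)
    # placeholder for a continuous color style on the dominance ratio:
    "get_fill_color": "color_continuous_on_ratio",
    "opacity": 0.6,
}
point_layer = {
    "type": "CartoLayer",
    "id": "points",
    # placeholder for a category color style on the gym type column:
    "get_fill_color": "color_categories_on_gym_type",
    "point_radius_min_pixels": 3,
}
layers = [hexagon_layer, point_layer]
print([layer["id"] for layer in layers])  # ['hexagons', 'points']
```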
We use the h3 module of the Analytics Toolbox to compute the H3 cell of each gym in the “Crossfit” and “Competition” tables; we then join them by H3 id and download the data.
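The join-and-ratio logic can be sketched in plain Python with made-up H3 ids standing in for the Analytics Toolbox output (the real notebook performs this step inside Snowflake):

```python
from collections import Counter

# Hypothetical H3 cell ids for each gym; stand-ins for the ids the
# Analytics Toolbox would compute from the gym locations.
crossfit_cells = ["8928308280fffff", "8928308280fffff", "8928308283bffff"]
competition_cells = ["8928308280fffff", "8928308287bffff"]

crossfit = Counter(crossfit_cells)
competition = Counter(competition_cells)

# Join both tables by H3 id and compute the dominance ratio:
# Crossfit gyms over the total number of gyms in each cell.
dominance = {
    cell: crossfit[cell] / (crossfit[cell] + competition[cell])
    for cell in set(crossfit) | set(competition)
}
print(dominance["8928308280fffff"])  # 2 of 3 gyms are Crossfit -> ~0.667
```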