Importing data

CARTO allows you to import data from local files or URLs. This functionality is currently supported for the following data warehouses: Google BigQuery, Snowflake, Amazon Redshift, PostgreSQL, and the CARTO Data Warehouse.

To import data into Amazon Redshift in Self-Hosted environments, you'll first need to configure your own S3 Bucket in Settings.
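
The bucket acts as a staging area. Below is a minimal sketch of the pattern it enables, where a file is first uploaded to S3 and then loaded into Redshift with a COPY statement; CARTO handles this staging for you once the bucket is configured, and every name and credential here is a placeholder.

    # Illustration of the staging pattern the S3 bucket enables. All names
    # and credentials are placeholders; CARTO performs the staging itself
    # once the bucket is configured in Settings.
    import boto3
    import psycopg2

    s3 = boto3.client("s3")
    s3.upload_file("stores.csv", "my-staging-bucket", "imports/stores.csv")

    conn = psycopg2.connect(host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",
                            dbname="analytics", user="importer", password="***")
    with conn, conn.cursor() as cur:
        cur.execute("""
            COPY my_schema.stores
            FROM 's3://my-staging-bucket/imports/stores.csv'
            IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-import'
            CSV IGNOREHEADER 1;
        """)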

To import your data, go to Data Explorer and click the Import data button in the top right corner (or the import data icon if you have a connection or folder selected).

A new dialog will open where you can choose between uploading a local file (from your computer) or importing from a URL. Once you have selected your file, click Continue to choose the location and name of the output table.
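
The same two entry points (a local file or a URL) can also be scripted. The sketch below is hypothetical: the endpoint path, authentication scheme, and field names are assumptions, so check the CARTO API reference for the actual contract before relying on it.

    # Hypothetical sketch of submitting an import outside the UI. The URL,
    # token, and field names are assumptions, not a verified API contract.
    import requests

    IMPORTS_URL = "https://gcp-us-east1.api.carto.com/v3/imports"  # placeholder
    HEADERS = {"Authorization": "Bearer <access-token>"}           # placeholder

    # Upload a local file...
    with open("stores.geojson", "rb") as f:
        r = requests.post(IMPORTS_URL, headers=HEADERS, files={"file": f},
                          params={"connection": "my-connection",
                                  "destination": "my_dataset.stores"})
    r.raise_for_status()

    # ...or import from a URL instead.
    r = requests.post(IMPORTS_URL, headers=HEADERS,
                      json={"connection": "my-connection",
                            "destination": "my_dataset.stores",
                            "url": "https://example.com/stores.geojson"})
    r.raise_for_status()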

CARTO will define the schema of the resulting table by default, but you can turn the toggle off to manually define the schema. Custom schema definition might be needed in some cases. For example, postal codes are often interpreted as numbers when they should be strings. For more information on manual schema definition, check Defining a custom schema.
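
The postal code pitfall is easy to reproduce with any type-inferring reader. The snippet below uses pandas purely to illustrate the behavior (it is not CARTO's internal importer): inference turns the codes into integers and drops their leading zeros, while an explicit string type preserves them.

    # Automatic type inference reads postal codes as numbers and drops
    # leading zeros; declaring the column as a string preserves them.
    import io
    import pandas as pd

    data = "city,postal_code\nBoston,02116\nNewark,07102\n"

    inferred = pd.read_csv(io.StringIO(data))
    print(inferred["postal_code"].tolist())   # [2116, 7102] -- zeros lost

    explicit = pd.read_csv(io.StringIO(data), dtype={"postal_code": str})
    print(explicit["postal_code"].tolist())   # ['02116', '07102']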

Once you have selected the location and table name, click Import to import your data. After importing, the table will be available in your selected connection. You can preview it in Data Explorer or create a map using Builder.

Write permissions

In order to import data, your connection needs write permissions on the chosen destination. If the connection does not have the required permissions, a warning will appear. Read more about why we require certain permissions on connections.

Supported formats

CARTO currently supports importing CSV, GeoJSON, GeoPackage, KML, KMZ, TAB, Shapefiles (in a zip package), and GeoParquet files with at least two columns and up to 1 GB in size.
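
Since a shapefile is really a set of sidecar files (.shp, .shx, .dbf, .prj, and so on), it must be bundled into a single zip before uploading. A minimal sketch, with placeholder file names:

    # Bundle the shapefile components into one zip, as the importer expects,
    # and verify the result stays under the 1 GB import limit.
    import os
    import zipfile

    parts = ["roads.shp", "roads.shx", "roads.dbf", "roads.prj"]
    with zipfile.ZipFile("roads.zip", "w", zipfile.ZIP_DEFLATED) as zf:
        for part in parts:
            zf.write(part)

    assert os.path.getsize("roads.zip") <= 1024**3, "imports are capped at 1 GB"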

Cloud Optimized GeoTIFF raster files are also supported in Snowflake and BigQuery connections. To learn more, see our section on importing raster files.

For CSV files, CARTO will try to autodetect the geometry column or create the geometries from latitude/longitude columns (see the detection sketch after this list). The supported column names are:

  • For geometry: geom, Geom, geometry, the_geom, wkt, wkb

  • For latitude: latitude, lat, Latitude

  • For longitude: longitude, lon, Lon, Longitude, lng, Lng

The expected delimiters are comma (,), semicolon (;), or tab.
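
The detection rules above can be mirrored in a few lines. This sketch is only an approximation of the documented behavior, not CARTO's actual implementation:

    # Sketch of the detection rules: look for a geometry column first, then
    # fall back to a latitude/longitude pair.
    import csv

    GEOM_NAMES = {"geom", "Geom", "geometry", "the_geom", "wkt", "wkb"}
    LAT_NAMES = {"latitude", "lat", "Latitude"}
    LON_NAMES = {"longitude", "lon", "Lon", "Longitude", "lng", "Lng"}

    def detect_geometry(path):
        with open(path, newline="") as f:
            # The importer accepts comma, semicolon, or tab delimiters.
            dialect = csv.Sniffer().sniff(f.read(4096), delimiters=",;\t")
            f.seek(0)
            header = next(csv.reader(f, dialect))
        geom = next((c for c in header if c in GEOM_NAMES), None)
        if geom:
            return {"geometry": geom}
        lat = next((c for c in header if c in LAT_NAMES), None)
        lon = next((c for c in header if c in LON_NAMES), None)
        if lat and lon:
            return {"latitude": lat, "longitude": lon}
        return None  # no geometry detected

    print(detect_geometry("stores.csv"))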

Manually defining schema

As explained above, you can choose between letting CARTO automatically define the schema for the output table or you can manually define it yourself. When manually defining a schema, here are the most important things to take into account:

  • The options available for each column are the native data types of the destination data warehouse (e.g. Google BigQuery, Snowflake, etc.)

  • Object-like data types such as RECORD, STRUCT, ARRAY or OBJECT are not supported.

  • When using a custom schema, error tolerance will be 0 (see Error tolerance below).

To define a custom schema, disable the Let CARTO automatically define the schema switch.
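
As a hypothetical illustration, a custom schema boils down to a mapping from column names to native types of the destination warehouse (BigQuery types shown here):

    # Hypothetical custom schema using native BigQuery types. Object-like
    # types (RECORD, STRUCT, ARRAY, OBJECT) are not supported.
    schema = {
        "store_id": "INT64",
        "postal_code": "STRING",   # text, so leading zeros survive
        "revenue": "FLOAT64",
        "opened_at": "DATE",
        "geom": "GEOGRAPHY",
    }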

Error tolerance

By default, CARTO will work with the destination data warehouse to avoid failing an entire import when only a small subset of rows fails. This means that for a file with 10,000 rows, if one row fails, CARTO will still create a table with the remaining 9,999 rows.

This is the default error tolerance for each destination data warehouse:

  • BigQuery and CARTO Data Warehouse: up to 1,000 rows

  • Snowflake: unlimited rows

  • Redshift: up to 1,000 rows

  • PostgreSQL: 0 rows (the import will fail if any row throws an error)
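
Encoded as data, these defaults make for a quick pre-flight check; the function below simply restates the documented rules, including the zero tolerance that applies when a custom schema is used.

    # The documented default tolerances. None means unlimited failed rows.
    TOLERANCE = {
        "BigQuery": 1000,
        "CARTO Data Warehouse": 1000,
        "Snowflake": None,
        "Redshift": 1000,
        "PostgreSQL": 0,
    }

    def import_succeeds(warehouse, failed_rows, custom_schema=False):
        # A custom schema drops the tolerance to 0 regardless of warehouse.
        limit = 0 if custom_schema else TOLERANCE[warehouse]
        return limit is None or failed_rows <= limit

    print(import_succeeds("BigQuery", 1))       # True: 9,999 of 10,000 rows land
    print(import_succeeds("PostgreSQL", 1))     # False: any failing row aborts
    print(import_succeeds("Snowflake", 5000))   # True: unlimited tolerance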

Region-related errors

If your data warehouse is hosted in a different cloud region than your CARTO SaaS organization, imports might fail. This depends on the limitations and setup of each data warehouse and cloud provider, and is not controlled by CARTO.
