Importing data
CARTO allows you to import data from local files or URLs. This functionality is currently supported for the following Data Warehouses:
BigQuery (including CARTO Data Warehouse)
Snowflake
Amazon Redshift
PostgreSQL
To import your data, go to Data Explorer and click the Import data button in the top right corner (or the import data icon if you have a connection or folder selected).
A new dialog will open where you can choose between uploading a local file (from your computer) or importing from a URL. Once you have selected your file, click Continue to select the location and name of the output table.
CARTO will define the schema of the resulting table by default, but you can turn the toggle off to manually define the schema. Custom schema definition might be needed in some cases. For example, postal codes are often interpreted as numbers when they should be strings. For more information on manual schema definition, check Defining a custom schema.
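As an illustration of the pitfall (CARTO's importer is not pandas, but the type-inference issue is the same), this hypothetical Python snippet shows leading zeros being lost when a postal code column is inferred as a number, and how an explicit string type preserves them:
import io
import pandas as pd

csv = "postal_code,city\n02139,Cambridge\n00501,Holtsville\n"

# Automatic type inference reads the column as an integer, dropping leading zeros
inferred = pd.read_csv(io.StringIO(csv))
print(inferred["postal_code"].tolist())  # [2139, 501]

# Declaring the column as a string preserves the original values
typed = pd.read_csv(io.StringIO(csv), dtype={"postal_code": str})
print(typed["postal_code"].tolist())  # ['02139', '00501']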
You can also drag and drop files directly into Data Explorer to trigger the import flow.
When you import data via a remote URL, the data is imported only once and will not stay in sync with the remote URL.

Once you have selected the location and table name, click Import to import your data. After importing, the table will be available in your selected connection. You can preview it in Data Explorer or create a map using Builder.
Write permissions
In order to import data, your connection needs to have write permissions on the chosen destination. If the connection does not have the required permissions, a warning will appear. Read more about why we require certain permissions on connections.
Supported formats
CARTO currently supports importing CSV, GeoJSON, GeoPackage, KML, KMZ, TAB, Shapefiles (in a zip package), and GeoParquet files with at least two columns and up to 1 GB in size.
Cloud Optimized GeoTIFF raster files are also supported in Snowflake and BigQuery connections. To learn more, see our section on importing raster files.
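When uploading a Shapefile, all of its component files need to travel in a single zip package. As a minimal sketch (the file names are hypothetical), Python's standard zipfile module can build one:
import zipfile

# A Shapefile consists of several companion files that must be zipped together
parts = ["roads.shp", "roads.shx", "roads.dbf", "roads.prj"]

with zipfile.ZipFile("roads.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
    for part in parts:
        zf.write(part)
For large packages, the TIP below describes a faster alternative based on GDAL's sozip utility.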
TIP 💡
When working with zipped TAB or Shapefiles, use the sozip utility that comes with GDAL to generate an optimized ZIP file that will be much faster to import. For example:
sozip --optimize-from=size_big_2.zip output.shp.zip
For CSV files, CARTO will try to autodetect the geometry column or create the geometries from latitude/longitude columns (see the sample CSV after this list). The supported column names are:
For geometry: geom, Geom, geometry, the_geom, wkt, wkb
For latitude: latitude, lat, Latitude
For longitude: longitude, lon, Lon, Longitude, lng, Lng
The expected delimiters are: comma (,), semicolon (;), or tab.
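For example, a minimal CSV like the following (hypothetical data) would be imported as a table of point geometries built from the latitude/longitude pair:
latitude,longitude,name
40.4168,-3.7038,Madrid
41.3874,2.1686,Barcelona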
Manually defining schema
As explained above, you can either let CARTO automatically define the schema for the output table or define it manually. When manually defining a schema, these are the most important things to take into account:
The options available for each column are the native data types of the destination data warehouse (e.g. Google BigQuery, Snowflake...).
Object-like data types such as RECORD, STRUCT, ARRAY or OBJECT are not supported.
When using a custom schema, the error tolerance will be 0.
To define a custom schema, just make sure to disable the Let CARTO automatically define the schema switch.

Error tolerance
By default, CARTO works with the destination data warehouse to avoid failing an entire import process when only a small subset of rows fails. This means that for a file with 10,000 rows, if one row fails, CARTO will still create a table with the remaining 9,999 rows.
This is the default error tolerance for each destination data warehouse:
BigQuery and CARTO Data Warehouse: up to 1,000 rows
Snowflake: unlimited rows
Redshift: up to 1,000 rows
PostgreSQL: 0 rows (importing will fail if a row throws an error)
When using a custom schema, the error tolerance will be 0 rows in all cases. If you need to customize the error tolerance for your imports, consider using our Imports API.
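For reference, an Imports API call could look like the Python sketch below. The regional host, endpoint path, and body parameters shown here are assumptions based on CARTO's v3 API conventions; check the Imports API documentation for the exact contract:
import requests

API_BASE = "https://gcp-us-east1.api.carto.com"  # assumption: your tenant's regional API host
TOKEN = "<your API access token>"

# Assumption: POST /v3/imports with a source URL and a destination connection
resp = requests.post(
    f"{API_BASE}/v3/imports",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "connection": "my-bigquery-connection",  # hypothetical connection name
        "source": "https://example.com/airports.csv",
        "destination": "my-project.my_dataset.airports",
    },
)
resp.raise_for_status()
print(resp.json())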
Region-related errors
If your data warehouse is deployed in a different cloud region than your CARTO SaaS organization, imports might fail. This depends on the limitations and setup of each data warehouse and cloud provider, and is not set by CARTO.
To check your CARTO SaaS region, go to Settings.
In Self-Hosted deployments this problem can be avoided by setting up your own import buckets in the correct region.
In SaaS deployments you can solve this problem for Amazon Redshift connections by customizing your S3 import bucket.