Table and column capitalization
When working with CARTO, how your table and column names are handled depends on two factors: your data warehouse provider and how you access the data.
How you access data matters
| Access method | Quoting behavior |
| --- | --- |
| Table sources (Data Explorer, Builder table source, Workflows) | CARTO handles quoting automatically, but has limitations with non-standard names |
| SQL Query sources (Builder SQL editor) | You control quoting: full flexibility, but requires knowledge of provider rules |
Provider-specific rules
Each data warehouse treats unquoted identifiers differently:
| Provider | Unquoted identifiers | Quote character |
| --- | --- | --- |
| BigQuery | Case-sensitive (unchanged) | `` ` `` (backticks) |
| Snowflake | Converted to UPPERCASE | `"` (double quotes) |
| PostgreSQL | Converted to lowercase | `"` (double quotes) |
| Redshift | Converted to lowercase | `"` (double quotes) |
| Databricks | Case-insensitive | `` ` `` (backticks) |
Known limitations with table sources
When loading tables in Data Explorer, Builder (table source), or Workflows, certain column naming patterns are not fully supported.
Unsupported patterns
| Provider | Pattern | Examples | Workaround |
| --- | --- | --- | --- |
| Snowflake | Lowercase or mixed-case column names (created with quotes) | `"myColumn"`, `"revenue"` | Use SQL Query source, or rename columns to UPPERCASE |
| PostgreSQL | Uppercase or mixed-case column names (created with quotes) | `"MyColumn"`, `"Revenue"` | Use SQL Query source, or rename columns to lowercase |
| All providers | Columns with special characters | `my-column`, `my column` | Use SQL Query source, or rename columns |
| All providers | SQL reserved words as column names | `select`, `from`, `where` | Use SQL Query source, or rename columns |
Why this happens: When you create a column with quotes in your warehouse (e.g., "myColumn" in Snowflake), the warehouse preserves that exact casing. However, when CARTO builds queries for table sources, it may not apply the required quoting for these edge cases.
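A minimal Snowflake sketch of the mismatch, with hypothetical names:

```sql
-- Quoting at creation time preserves the exact casing
CREATE TABLE orders ("myColumn" STRING);

SELECT "myColumn" FROM orders;  -- works: the quoted reference matches exactly
SELECT myColumn FROM orders;    -- fails: unquoted folds to MYCOLUMN, which does not exist
```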
Using SQL Query sources
SQL Query sources give you full control over identifier quoting. Use the appropriate quote character for your warehouse:
Snowflake
Double quotes preserve lowercase or mixed case:
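A sketch with hypothetical names:

```sql
-- Double quotes preserve the exact casing of lowercase or mixed-case columns
SELECT "myColumn", "revenue"
FROM MY_DB.MY_SCHEMA.MY_TABLE;
```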
PostgreSQL and Redshift
Double quotes preserve uppercase or mixed case:
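A sketch with hypothetical names:

```sql
-- Double quotes preserve the exact casing of uppercase or mixed-case columns
SELECT "MyColumn", "Revenue"
FROM my_schema.my_table;
```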
BigQuery
Backticks for all identifiers; case is always preserved:
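A sketch with hypothetical names (`select` here is a column named after a reserved word):

```sql
-- Backticks quote identifiers; casing is preserved whether quoted or not
SELECT `select`, Revenue
FROM `my-project.my_dataset.my_table`;
```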
Databricks
Backticks for special characters; case doesn't matter:
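A sketch with hypothetical names:

```sql
-- Backticks are needed only for special characters or reserved words;
-- identifier casing is not significant
SELECT `my-column`, revenue
FROM my_catalog.my_schema.my_table;
```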
Best practices
To ensure compatibility across all CARTO features, use naming conventions that match your provider's default behavior:
| Provider | Recommended convention |
| --- | --- |
| Snowflake | UPPERCASE or UPPER_SNAKE_CASE |
| PostgreSQL / Redshift | lowercase or snake_case |
| BigQuery | Any consistent style (case-sensitive) |
| Databricks | Any style (snake_case recommended) |
General guidelines:
- Use snake_case with letters matching your provider's default (uppercase for Snowflake, lowercase for PostgreSQL)
- Avoid spaces, hyphens, and special characters in names
- Avoid SQL reserved words (`select`, `from`, `where`, `order`, `group`, etc.)
- If you must use non-standard names, access the data via SQL Query source, or rename them as in the sketch below
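If you do need to rename a column to match your provider's default, a minimal sketch with hypothetical table and column names:

```sql
-- Snowflake: rename a quoted, mixed-case column to the UPPERCASE default
ALTER TABLE orders RENAME COLUMN "myColumn" TO MY_COLUMN;

-- PostgreSQL: rename a quoted, mixed-case column to the lowercase default
ALTER TABLE orders RENAME COLUMN "MyColumn" TO my_column;
```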
Troubleshooting

| Symptom | Cause | Solution |
| --- | --- | --- |
| "Column not found" in Snowflake | Column was created with lowercase (quoted) | Use SQL Query source with proper quoting: `"myColumn"` |
| "Column does not exist" in PostgreSQL | Column was created with uppercase (quoted) | Use SQL Query source with proper quoting: `"MyColumn"` |
| Table loads but some columns are missing | Column names use reserved words or special characters | Use SQL Query source |
| Works in warehouse console, fails in CARTO table view | Non-standard identifier naming | Use SQL Query source instead |
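As a concrete workaround, a SQL Query source can quote the problematic identifier explicitly. A sketch for Snowflake with hypothetical names; aliasing to a provider-default name keeps downstream CARTO features working with a standard identifier:

```sql
SELECT "myColumn" AS MY_COLUMN
FROM MY_DB.MY_SCHEMA.MY_TABLE;
```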