Configure your own buckets (Helm)

For CARTO Self-hosted using Kubernetes and Helm



This documentation only applies to advanced Orchestrated container deployments using Kubernetes and Helm

Every CARTO Self-Hosted installation needs a set of configured buckets to store resources used by the platform. These storage buckets are part of the required infrastructure for importing and exporting data, map thumbnails, customization assets (custom logos and markers), and other internal data.

You can create and use your own storage buckets in any of the following supported storage providers: Google Cloud Storage, AWS S3, or Azure Blob Storage.

Pre-requisites

  1. Create 2 buckets in your preferred cloud provider:

    • Import Bucket

    • Thumbnails Bucket

There are no bucket name constraints.

Map thumbnail storage objects (.png files) can be configured to be public (default) or private. To change this, set WORKSPACE_THUMBNAILS_PUBLIC="false". Some features, such as branding and custom markers, won't work unless the bucket is public. However, there is a workaround to avoid making the whole bucket public: allow public objects, allow ACLs (or non-uniform permissions), and disable server-side encryption.
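On GCS, for example, that workaround could be sketched as follows (bucket and object names are hypothetical): switch the bucket to fine-grained access so ACLs are allowed, then make individual objects public instead of the whole bucket.

```shell
# Hypothetical bucket name; replace with your thumbnails bucket.
# Switch the bucket to fine-grained (non-uniform) access so per-object ACLs are allowed:
gcloud storage buckets update gs://my-carto-thumbnails --no-uniform-bucket-level-access

# Make a single object public instead of the whole bucket:
gcloud storage objects update gs://my-carto-thumbnails/some-thumbnail.png \
  --add-acl-grant=entity=AllUsers,role=READER
```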

  2. Create the data export bucket. This bucket has to be created in a different storage provider depending on your data warehouse:

    • BigQuery: Google Cloud Storage

    • Snowflake: AWS S3

    • Redshift: AWS S3

    • Amazon RDS: AWS S3

For buckets created in AWS S3:

  • ACLs should be allowed.

  3. CORS configuration: the Thumbnails and Import buckets require the following CORS headers to be configured.

    • Allowed origins: *

    • Allowed methods: GET, PUT, POST

    • Allowed headers (common): Content-Type, Content-MD5, Content-Disposition, Cache-Control

    • GCS (extra): x-goog-content-length-range, x-goog-meta-filename

    • Azure (extra): Access-Control-Request-Headers, X-MS-Blob-Type

    • Max age: 3600

CORS is configured at the bucket level in GCS and S3, and at the storage account level in Azure.
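As an illustrative sketch (bucket name hypothetical), the settings above map to a GCS CORS file that can then be applied with gcloud:

```shell
# Write the CORS policy described above to a local file.
cat > cors.json <<'EOF'
[
  {
    "origin": ["*"],
    "method": ["GET", "PUT", "POST"],
    "responseHeader": [
      "Content-Type", "Content-MD5", "Content-Disposition", "Cache-Control",
      "x-goog-content-length-range", "x-goog-meta-filename"
    ],
    "maxAgeSeconds": 3600
  }
]
EOF

# Apply it to the bucket (hypothetical name; requires gcloud and permissions on the bucket):
# gcloud storage buckets update gs://my-carto-imports --cors-file=cors.json
```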

  4. Generate credentials with read/write permissions to access those buckets. The supported authentication methods are:

    • GCS: Service Account Key

    • AWS: Access Key ID and Secret Access Key

    • Azure Blob: Access Key

Setup

Import and Thumbnails buckets

To use Google Cloud Storage custom buckets, you need to:

  1. Add the following lines to your customizations.yaml and replace the <values> with your own settings:

appConfigValues:
  storageProvider: "gcp"
  workspaceImportsBucket: <import_bucket_name>
  workspaceImportsPublic: <false|true>
  workspaceThumbnailsBucket: <thumbnails_bucket_name>
  workspaceThumbnailsPublic: <false|true>
  thumbnailsBucketExternalURL: <public or authenticated external bucket URL>
  googleCloudStorageProjectId: <gcp_project_id>

Note that thumbnailsBucketExternalURL could be https://storage.googleapis.com/<thumbnails_bucket_name>/ for public access or https://storage.cloud.google.com/<thumbnails_bucket_name>/ for authenticated access.

  2. Select a Service Account that will be used by the application to interact with the buckets. There are two options:

    1. Using a custom Service Account that will be used not only for the buckets, but also for the services deployed by CARTO. If you are using Workload Identity, this is your option.

    2. Using a dedicated Service Account only for the buckets.

  3. Grant the selected Service Account the role roles/iam.serviceAccountTokenCreator in the GCP project where it was created.

⚠️ We don't recommend granting this role at the project IAM level, but rather at the Service Account permissions level (IAM > Service Accounts > your_service_account > Permissions).

  4. Grant the selected Service Account the role roles/storage.admin on the buckets created.
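A sketch of the two grants above using the gcloud CLI (the Service Account email and bucket names are hypothetical; replace them with your own):

```shell
# Hypothetical Service Account; replace with your own.
SA="carto-buckets@my-gcp-project.iam.gserviceaccount.com"

# Token creator role, granted on the Service Account itself rather than at project level:
gcloud iam service-accounts add-iam-policy-binding "$SA" \
  --member="serviceAccount:$SA" \
  --role="roles/iam.serviceAccountTokenCreator"

# Storage admin role, granted per bucket (hypothetical bucket names):
for bucket in my-carto-imports my-carto-thumbnails; do
  gcloud storage buckets add-iam-policy-binding "gs://$bucket" \
    --member="serviceAccount:$SA" \
    --role="roles/storage.admin"
done
```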

  5. [OPTIONAL] Pass your GCP credentials as secrets. This is only required if you are going to use a dedicated Service Account only for the buckets:

    • Option 1: Automatically create the secret:

      appSecrets:
        googleCloudStorageServiceAccountKey:
          value: |
            <REDACTED>

    appSecrets.googleCloudStorageServiceAccountKey.value should be in plain text, preserving the multiline format and correct indentation.

    • Option 2: Using existing secret: Create a secret running the command below, after replacing the <PATH_TO_YOUR_SECRET.json> value with the path to the Service Account key file:

      kubectl create secret generic \
        [-n my-namespace] \
        mycarto-google-storage-service-account \
        --from-file=key=<PATH_TO_YOUR_SECRET.json>

      Add the following lines to your customizations.yaml, without replacing any value:

      appSecrets:
        googleCloudStorageServiceAccountKey:
          existingSecret:
            name: mycarto-google-storage-service-account
            key: key
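To sanity-check that the secret holds the expected key file, something like the following can be used (secret name as in the example above; add -n my-namespace if you used one):

```shell
# Decode the stored Service Account key and print the first bytes only:
kubectl get secret mycarto-google-storage-service-account \
  -o jsonpath='{.data.key}' | base64 -d | head -c 80
```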

To use AWS S3 custom buckets, you need to:

  1. Create an IAM user and generate a programmatic key ID and secret.

  2. Grant this user read/write access permissions over the buckets. If server-side encryption is enabled, the user must also be granted permissions over the KMS key used.
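As a sketch, a minimal IAM policy for that user could look like this (bucket names are hypothetical), mirroring the ListBucket/object-level split used later in this guide. It can be written locally and attached with aws iam put-user-policy:

```shell
# Hypothetical bucket names; replace with your Import and Thumbnails buckets.
cat > carto-buckets-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": [
        "arn:aws:s3:::my-carto-imports",
        "arn:aws:s3:::my-carto-thumbnails"
      ]
    },
    {
      "Effect": "Allow",
      "Action": "s3:*Object",
      "Resource": [
        "arn:aws:s3:::my-carto-imports/*",
        "arn:aws:s3:::my-carto-thumbnails/*"
      ]
    }
  ]
}
EOF

# Attach it to the IAM user created in step 1 (hypothetical user and policy names):
# aws iam put-user-policy --user-name carto-buckets-user \
#   --policy-name carto-buckets-rw --policy-document file://carto-buckets-policy.json
```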

  3. Add the following lines to your customizations.yaml and replace the <values> with your own settings:

appConfigValues:
  storageProvider: "s3"
  workspaceImportsBucket: <import_bucket_name>
  workspaceImportsPublic: <false|true>
  workspaceThumbnailsBucket: <thumbnails_bucket_name>
  workspaceThumbnailsPublic: <false|true>
  thumbnailsBucketExternalURL: <external bucket URL>
  awsS3Region: <s3_buckets_region>

Note that thumbnailsBucketExternalURL should be https://<thumbnails_bucket_name>.s3.amazonaws.com/

  4. Pass your AWS credentials as secrets by using one of the options below:

    • Option 1: Automatically create a secret

      Add the following lines to your customizations.yaml replacing it with your access key values:

      appSecrets:
        awsAccessKeyId:
          value: "<REDACTED>"
        awsAccessKeySecret:
          value: "<REDACTED>"

      appSecrets.awsAccessKeyId.value and appSecrets.awsAccessKeySecret.value should be in plain text.

    • Option 2: Using an existing secret

      Create a secret running the command below, after replacing the <REDACTED> values with your key values:

      kubectl create secret generic \
        [-n my-namespace] \
        mycarto-custom-s3-secret \
        --from-literal=awsAccessKeyId=<REDACTED> \
        --from-literal=awsSecretAccessKey=<REDACTED>

      Use the same namespace where you are installing the Helm chart.

      Add the following lines to your customizations.yaml, without replacing any value:

      appSecrets:
        awsAccessKeyId:
          existingSecret:
            name: mycarto-custom-s3-secret
            key: awsAccessKeyId
        awsAccessKeySecret:
          existingSecret:
            name: mycarto-custom-s3-secret
            key: awsSecretAccessKey

To use Azure Blob Storage custom buckets, you need to:

  1. Generate an Access Key from the storage account's Security properties.
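With the Azure CLI, for example, the account keys can be listed like this (resource group and storage account names are hypothetical):

```shell
# Hypothetical names; replace with your resource group and storage account.
az storage account keys list \
  --resource-group my-carto-rg \
  --account-name mycartostorage \
  --query "[0].value" --output tsv
```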

  2. Add the following lines to your customizations.yaml and replace the <values> with your own settings:

appConfigValues:
  storageProvider: "azure-blob"
  azureStorageAccount: <storage_account_name>
  workspaceImportsBucket: <import_bucket_name>
  workspaceImportsPublic: <false|true>
  workspaceThumbnailsBucket: <thumbnails_bucket_name>
  thumbnailsBucketExternalURL: <external bucket URL>
  workspaceThumbnailsPublic: <false|true>

Note that thumbnailsBucketExternalURL should be https://<azure_storage_account>.blob.core.windows.net/<thumbnails_bucket_name>/

  2. Pass your credentials as secrets by using one of the options below:

    • Option 1: Automatically create the secret: Add the following lines to your customizations.yaml replacing it with your access key value:

      appSecrets:
        azureStorageAccessKey:
          value: "<REDACTED>"

      appSecrets.azureStorageAccessKey.value should be in plain text.

    • Option 2: Using existing secret: Create a secret running the command below, after replacing the <REDACTED> values with your key values:

      kubectl create secret generic \
        [-n my-namespace] \
        mycarto-custom-azure-secret \
        --from-literal=azureStorageAccessKey=<REDACTED>

      Use the same namespace where you are installing the Helm chart.

      Add the following lines to your customizations.yaml, without replacing any value:

      appSecrets:
        azureStorageAccessKey:
          existingSecret:
            name: mycarto-custom-azure-secret
            key: azureStorageAccessKey

Data export bucket

Configure data export bucket for BigQuery

To enable exporting data from BigQuery on the CARTO Self-Hosted platform, these are the required steps:

  1. Define the name of the bucket that will be used for exporting data in your customizations.yaml file:

appConfigValues:
  workspaceExportsBucket: <YOUR_EXPORTS_BUCKET>

Configure data exports in Snowflake and Redshift

Snowflake and Redshift require an AWS S3 bucket to export data from the CARTO platform. These are the steps needed to allow exporting data from CARTO Self-Hosted with these providers:

  1. Create an IAM user and generate a programmatic key ID and secret. If server-side encryption is enabled, the user must be granted permissions over the KMS key used.

If you've already configured the Import and Thumbnails buckets using AWS S3, you can use the same user you already created for these buckets.

  2. Create an AWS IAM role with the following settings:

    1. Trusted entity type: Custom trust policy.

    2. Custom trust policy: Make sure to replace <your_aws_user_arn>.

    {
      "Version": "2012-10-17",
      "Statement": [
          {
              "Effect": "Allow",
              "Principal": {
                  "AWS": "<your_aws_user_arn>"
              },
              "Action": [
                  "sts:AssumeRole",
                  "sts:TagSession"
              ]
          }
      ]
    }
    3. Add permissions: Create a new permissions policy, replacing <your_aws_s3_bucket_name>.

    {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Effect": "Allow",
               "Action": "s3:ListBucket",
               "Resource": "arn:aws:s3:::<your_aws_s3_bucket_name>"
           },
           {
               "Effect": "Allow",
               "Action": "s3:*Object",
               "Resource": "arn:aws:s3:::<your_aws_s3_bucket_name>/*"
           }
       ]
    }
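The role and policy above can be created with the AWS CLI along these lines (role and policy names are hypothetical, assuming the two JSON documents from the previous steps are saved locally with the placeholders filled in):

```shell
# Hypothetical names; trust-policy.json and export-policy.json hold the
# documents from the steps above, with the placeholders replaced.
aws iam create-role \
  --role-name carto-export-role \
  --assume-role-policy-document file://trust-policy.json

aws iam put-role-policy \
  --role-name carto-export-role \
  --policy-name carto-export-s3-access \
  --policy-document file://export-policy.json

# The role ARN printed by create-role is the value for exportAwsRoleArn.
```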
  3. Update your customizations.yaml file with the following values:

appConfigValues:
  awsExportBucket: <BUCKET_NAME>
  awsExportBucketRegion: <REGION>
  exportAwsRoleArn: <ROLE_ARN>

  4. Pass your AWS credentials as secrets by using one of the options below:

  • Option 1: Automatically create the secret:

    appSecrets:
      exportAwsSecretAccessKey:
        value: <REDACTED>
      exportAwsAccessKeyId:
        value: <REDACTED>

appSecrets.exportAwsSecretAccessKey.value and appSecrets.exportAwsAccessKeyId.value should be in plain text.

  • Option 2: Using existing secret: Create a secret running the command below:

    kubectl create secret generic \
      [-n my-namespace] \
      mycarto-export-aws-access-key \
      --from-literal=key-id=<ACCESS_KEY_ID> \
      --from-literal=key-secret=<ACCESS_KEY_SECRET>

    Add the following lines to your customizations.yaml, without replacing any value:

    appSecrets:
      exportAwsSecretAccessKey:
        existingSecret:
          name: mycarto-export-aws-access-key
          key: key-secret
      exportAwsAccessKeyId:
        existingSecret:
          name: mycarto-export-aws-access-key
          key: key-id

Configure data exports in Amazon RDS for PostgreSQL

The bucket to export data from Amazon RDS for PostgreSQL can be configured from the CARTO platform UI. Once your Self-Hosted installation is finished, check the documentation on how to configure your S3 bucket integration for Amazon RDS for PostgreSQL Exports in Builder.

If server-side encryption is enabled, the user must be granted permissions over the KMS key, following the AWS documentation.

How do I set up the CORS configuration? Check the provider docs for GCS, AWS S3, and Azure Blob Storage.

For BigQuery exports, grant read/write permissions to the service account used by your CARTO Self-Hosted installation on the GCS bucket created in the pre-requisites.
