Configure your own buckets (Helm)
For CARTO Self-hosted using Kubernetes and Helm
Every CARTO Self-Hosted installation requires a set of configured buckets to store resources used by the platform. These storage buckets are part of the required infrastructure for importing and exporting data, map thumbnails, customization assets (custom logos and markers), and other internal data.
You can create and use your own storage buckets in any of the following supported storage providers:
Create two buckets in your preferred cloud provider:
Import bucket
Thumbnails bucket
Create the data export bucket. The storage provider where this bucket must be created depends on your data warehouse:
BigQuery:
Snowflake:
Redshift:
Amazon RDS:
For buckets created in AWS S3:
ACLs must be enabled.
CORS configuration: the Thumbnails and Import buckets require the following CORS headers:
Allowed origins: *
Allowed methods: GET, PUT, POST
Allowed headers (common): Content-Type, Content-MD5, Content-Disposition, Cache-Control
GCS (extra): x-goog-content-length-range, x-goog-meta-filename
Azure (extra): Access-Control-Request-Headers, X-MS-Blob-Type
Max age: 3600
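For GCS, the headers above can be expressed as a JSON file and applied with `gsutil`. A sketch, where the file name and bucket name are placeholders:

```json
[
  {
    "origin": ["*"],
    "method": ["GET", "PUT", "POST"],
    "responseHeader": [
      "Content-Type",
      "Content-MD5",
      "Content-Disposition",
      "Cache-Control",
      "x-goog-content-length-range",
      "x-goog-meta-filename"
    ],
    "maxAgeSeconds": 3600
  }
]
```

Saved as `cors.json`, this can be applied with `gsutil cors set cors.json gs://<bucket_name>` (run once per bucket). AWS S3 and Azure use the same values but their own CORS configuration format.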
Generate credentials with read/write permissions to access those buckets. The supported authentication methods are:
GCS: Service Account Key
AWS: Access Key ID and Secret Access Key
Azure Blob: Access Key
To use custom Google Cloud Storage buckets, you need to:
Add the following lines to your customizations.yaml, replacing the <values> with your own settings:
Note that thumbnailsBucketExternalURL could be https://storage.googleapis.com/<thumbnails_bucket_name>/ for public access or https://storage.cloud.google.com/<thumbnails_bucket_name>/ for authenticated access.
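As an illustrative sketch of the configuration (the exact key names may vary with your chart version; `importBucket` and `thumbnailsBucket` are assumptions here, while `thumbnailsBucketExternalURL` is referenced above):

```yaml
appConfigValues:
  storageProvider: "gcp"
  # Assumption: key names for the two buckets may differ in your chart version
  importBucket: "<import_bucket_name>"
  thumbnailsBucket: "<thumbnails_bucket_name>"
  # Public access variant; use storage.cloud.google.com for authenticated access
  thumbnailsBucketExternalURL: "https://storage.googleapis.com/<thumbnails_bucket_name>/"
```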
Select a Service Account that will be used by the application to interact with the buckets. There are two options:
Using a custom Service Account that will be used not only for the buckets, but also for the services deployed by CARTO. If you are using Workload Identity, this is your option.
Using a dedicated Service Account only for the buckets
Grant the selected Service Account the role roles/iam.serviceAccountTokenCreator in the GCP project where it was created.
⚠️ We don't recommend granting this role at the project IAM level; grant it instead on the Service Account itself (IAM > Service Accounts > your_service_account > Permissions).
Grant the selected Service Account the role roles/storage.admin on the buckets created.
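The two grants above can be sketched with the gcloud CLI; the Service Account address, project ID, and bucket names are placeholders:

```shell
# Allow token creation on the Service Account itself (not at project level)
gcloud iam service-accounts add-iam-policy-binding \
  <your_service_account>@<project_id>.iam.gserviceaccount.com \
  --member="serviceAccount:<your_service_account>@<project_id>.iam.gserviceaccount.com" \
  --role="roles/iam.serviceAccountTokenCreator"

# Grant storage.admin on each bucket (repeat for import and thumbnails buckets)
gcloud storage buckets add-iam-policy-binding gs://<import_bucket_name> \
  --member="serviceAccount:<your_service_account>@<project_id>.iam.gserviceaccount.com" \
  --role="roles/storage.admin"
```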
[OPTIONAL] Pass your GCP credentials as secrets. This is only required if you are going to use a dedicated Service Account only for the buckets.
Option 1: Automatically create the secret:
appSecrets.googleCloudStorageServiceAccountKey.value should be in plain text, preserving the multiline format and correct indentation.
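A sketch of what this looks like in customizations.yaml, with the JSON key contents truncated as a placeholder:

```yaml
appSecrets:
  googleCloudStorageServiceAccountKey:
    # The full Service Account JSON key, as a multiline plain-text value
    value: |
      {
        "type": "service_account",
        "project_id": "<project_id>",
        ...
      }
```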
Option 2: Use an existing secret: Create a secret by running the command below, replacing <PATH_TO_YOUR_SECRET.json> with the path to the Service Account key file:
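A sketch of the command, assuming hypothetical secret and namespace names (the secret name must match the one referenced in your customizations.yaml):

```shell
# Hypothetical secret name and namespace; adjust to your installation
kubectl create secret generic carto-gcs-service-account-key \
  --namespace <carto_namespace> \
  --from-file=key=<PATH_TO_YOUR_SECRET.json>
```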
Add the following lines to your customizations.yaml, without replacing any value:
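A sketch of the existing-secret reference (the `existingSecret` structure and key names are assumptions that may differ in your chart version):

```yaml
appSecrets:
  googleCloudStorageServiceAccountKey:
    # Assumption: structure may vary by chart version
    existingSecret:
      name: carto-gcs-service-account-key  # must match the secret you created
      key: key
```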
To enable exporting data from BigQuery on the CARTO Self-Hosted platform, these are the required steps:
Define the name of the bucket that will be used for exporting data in your customizations.yaml file:
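A minimal sketch of this setting (the key name `exportBucket` is an assumption; check the key used by your chart version):

```yaml
appConfigValues:
  # Assumption: key name may differ in your chart version
  exportBucket: "<your_export_bucket_name>"
```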
Snowflake and Redshift require an AWS S3 bucket to export data from the CARTO platform. These are the steps needed to allow exporting data from CARTO Self-Hosted with these providers:
Create an IAM user and generate a programmatic access key ID and secret. If server-side encryption is enabled, the user must be granted permissions over the KMS key used.
Create an AWS IAM role with the following settings:
Trusted entity type: Custom trust policy.
Custom trust policy: Make sure to replace <your_aws_user_arn>.
Add permissions: Create a new permissions policy, replacing <your_aws_s3_bucket_name>.
Update your customizations.yaml file with the following values:
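An illustrative sketch of these values (all key names besides the documented appSecrets keys are assumptions; check your chart version):

```yaml
appConfigValues:
  # Assumption: key names may differ in your chart version
  exportBucket: "<your_aws_s3_bucket_name>"
  exportBucketRegion: "<your_bucket_region>"
  exportAwsRoleArn: "<your_iam_role_arn>"
```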
Pass your AWS credentials as secrets:
Option 1: Automatically create the secret:
appSecrets.exportAwsSecretAccessKey.value and appSecrets.exportAwsAccessKeyId.value should be in plain text, preserving the multiline format and correct indentation.
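A sketch of these two values in customizations.yaml, with the credentials as placeholders:

```yaml
appSecrets:
  exportAwsAccessKeyId:
    value: "<your_access_key_id>"
  exportAwsSecretAccessKey:
    value: "<your_secret_access_key>"
```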
Option 2: Use an existing secret: Create a secret by running the command below:
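A sketch of the command, assuming hypothetical secret and namespace names (the secret name must match the one referenced in your customizations.yaml):

```shell
# Hypothetical secret name and namespace; adjust to your installation
kubectl create secret generic carto-export-aws-credentials \
  --namespace <carto_namespace> \
  --from-literal=accessKeyId=<your_access_key_id> \
  --from-literal=secretAccessKey=<your_secret_access_key>
```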
Add the following lines to your customizations.yaml, without replacing any value:
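A sketch of the existing-secret references (the `existingSecret` structure and key names are assumptions that may differ in your chart version):

```yaml
appSecrets:
  exportAwsAccessKeyId:
    # Assumption: structure may vary by chart version
    existingSecret:
      name: carto-export-aws-credentials
      key: accessKeyId
  exportAwsSecretAccessKey:
    existingSecret:
      name: carto-export-aws-credentials
      key: secretAccessKey
```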
If server-side encryption is enabled, the user must be granted permissions over the KMS key.
How do I set up the CORS configuration? Check the documentation for your storage provider (GCS, AWS S3, or Azure Blob Storage).
Grant read/write permissions to the service account used by your CARTO Self-Hosted installation on the GCS bucket created earlier.
The bucket used to export data from Amazon RDS for PostgreSQL can be configured from the CARTO platform UI. Once your Self-Hosted installation is complete, check the related documentation for details.