# CARTO AI

CARTO AI adds AI capabilities across the CARTO platform. This includes [AI Agents](https://docs.carto.com/carto-user-manual/ai-agents) in Builder for natural-language interaction with maps, along with other AI features where supported.

## Enabling CARTO AI

To enable AI capabilities in your platform, navigate to the CARTO AI section within your organization's settings and toggle **Enable CARTO AI**.

<figure><img src="https://3029946802-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FybPdpmLltPkzGFvz7m8A%2Fuploads%2FE2TftTv3F74twhmzO6vE%2FScreenshot%202026-03-19%20at%2016.57.40.png?alt=media&#x26;token=8b8e8788-b2d9-4ae5-83c9-d4544ad40282" alt=""><figcaption></figcaption></figure>

{% hint style="info" %}
By enabling CARTO AI, you agree to the [CARTO AI terms and conditions](https://drive.google.com/file/d/1QqidTLrDNqMbWg0D1RbHC3lkOSOHVXam/view?usp=drive_link).
{% endhint %}

## Default Model

Once CARTO AI is enabled, you can select a default model that will be used across all CARTO AI features. This model is automatically applied when using or configuring AI-powered capabilities.

To set the default model:

* Navigate to Settings > CARTO AI
* Ensure CARTO AI is enabled
* Use the Default model dropdown to select your preferred model

You can change the default model at any time. The dropdown lists either the CARTO-managed models or, if configured, the custom models from your own AI providers.

{% hint style="info" %}
The default model is pre-selected when configuring or using AI-powered features. However, Editor users can override this selection and choose a different model as needed.
{% endhint %}

## Model Options

### CARTO managed models (default)

When you enable CARTO AI, these models are available immediately:

* **claude-opus-4.6**: Recommended default for geospatial use cases across CARTO AI. Best-in-class reasoning, spatial analysis, SQL generation, and high-accuracy insights.
* **claude-sonnet-4.6**: High-performance model with strong reasoning capabilities. Good balance of quality and efficiency.
* **gemini-3-pro**: Advanced Gemini model with strong geospatial reasoning and multi-modal capabilities.
* **gemini-3-flash**: Fast and efficient Gemini model, ideal for simpler queries and high-volume interactions.

### Bring your own model

Organizations can configure their own AI models from multiple providers, giving you full control over model selection, data residency, and cost management.

#### Supported Providers

<table><thead><tr><th width="195.63671875">Provider</th><th width="222.140625">Authentication</th><th>Supported Models</th></tr></thead><tbody><tr><td><a href="#google-vertex-ai">Google Vertex AI</a></td><td>Service Account</td><td><code>gemini-3-pro</code>, <code>gemini-3-flash</code>, <code>gemini-2.5-pro</code>, <code>gemini-2.5-flash</code>, <code>claude-opus-4.6</code>, <code>claude-sonnet-4.6</code>, <code>claude-opus-4.5</code>, <code>claude-sonnet-4.5</code></td></tr><tr><td><a href="#google-ai-studio">Google AI Studio</a></td><td>API Key</td><td><code>gemini-3-pro</code>, <code>gemini-3-flash</code>, <code>gemini-2.5-pro</code>, <code>gemini-2.5-flash</code></td></tr><tr><td><a href="#openai">OpenAI</a></td><td>API Key</td><td><code>gpt-5.2-pro</code>, <code>gpt-5.2</code>, <code>gpt-5</code>, <code>gpt-5-mini</code>, <code>gpt-4o</code>, <code>gpt-4o-mini</code></td></tr><tr><td><a href="#aws-bedrock">AWS Bedrock</a></td><td>AWS Credentials</td><td><code>claude-opus-4.6</code>, <code>claude-sonnet-4.6</code>, <code>claude-opus-4.5</code>, <code>claude-sonnet-4.5</code>, <code>claude-haiku-4.5</code></td></tr><tr><td><a href="#anthropic">Anthropic</a></td><td>API Key</td><td><code>claude-opus-4.6</code>, <code>claude-sonnet-4.6</code>, <code>claude-opus-4.5</code>, <code>claude-sonnet-4.5</code>, <code>claude-haiku-4.5</code></td></tr><tr><td><a href="#azure-openai">Azure OpenAI</a></td><td>API Key + Endpoint</td><td><code>gpt-5.2</code>, <code>gpt-5</code>, <code>gpt-5-mini</code>, <code>gpt-4o</code>, <code>gpt-4o-mini</code></td></tr><tr><td><a href="#snowflake-cortex">Snowflake Cortex</a></td><td>PAT Token</td><td><code>claude-opus-4.6</code>, <code>claude-sonnet-4.6</code>, <code>claude-opus-4.5</code>, <code>claude-sonnet-4.5</code>, <code>claude-haiku-4.5</code>, <code>openai-gpt-5</code>, <code>openai-gpt-5-mini</code></td></tr><tr><td><a href="#databricks-model-serving">Databricks Model Serving</a></td><td>PAT 
Token</td><td><code>databricks-claude-opus-4.6</code>, <code>databricks-claude-sonnet-4.6</code>, <code>databricks-claude-opus-4.5</code>, <code>databricks-claude-sonnet-4.5</code>, <code>databricks-claude-haiku-4.5</code>, <code>databricks-gemini-3-pro</code>, <code>databricks-gemini-3-flash</code>, <code>databricks-gemini-2.5-pro</code>, <code>databricks-gemini-2.5-flash</code>, <code>databricks-gpt-5.2</code>, <code>databricks-gpt-5</code>, <code>databricks-gpt-5-mini</code></td></tr><tr><td><a href="#oracle-generative-ai">Oracle Generative AI</a></td><td>OCI Credentials</td><td><code>google.gemini-2.5-pro</code>, <code>google.gemini-2.5-flash</code>, <code>xai.grok-4</code></td></tr></tbody></table>

#### OpenAI

Connect directly to OpenAI's API.

**Supported models**: `gpt-5.2-pro`, `gpt-5.2`, `gpt-5`, `gpt-5-mini`, `gpt-4o`, `gpt-4o-mini`

**Configuration**:

* **API Key** (required): Your OpenAI API Key
* **Base URL** (optional): Custom API endpoint URL

***

#### Anthropic

Connect directly to Anthropic's API.

**Supported models**: `claude-opus-4.6`, `claude-sonnet-4.6`, `claude-opus-4.5`, `claude-sonnet-4.5`, `claude-haiku-4.5`

**Configuration:**

* **API Key** (required): Your Anthropic API Key

***

#### Google AI Studio

Connect to Google's AI Studio API.

**Supported models**: `gemini-3-pro`, `gemini-3-flash`, `gemini-2.5-pro`, `gemini-2.5-flash`

**Configuration:**

* **API Key** (required): Your Google AI Studio API Key

***

#### Google Vertex AI

Connect to Gemini and Claude models via Google Cloud Platform.

**Supported models**: `gemini-3-pro`, `gemini-3-flash`, `gemini-2.5-pro`, `gemini-2.5-flash`, `claude-opus-4.6`, `claude-sonnet-4.6`, `claude-opus-4.5`, `claude-sonnet-4.5`

**Configuration:**

* **Project ID** (required): Your GCP project ID
* **Location** (required): GCP region (e.g., us-central1)
* **Service Account Credentials** (required): JSON credentials for a service account with Vertex AI access.
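Before pasting the service-account credentials, it can help to sanity-check that the key file is valid JSON with the fields Google includes in a standard service-account key. A minimal sketch in Python (the project and email values below are placeholders, not real credentials):

```python
import json

# Standard top-level fields present in a GCP service-account key file.
REQUIRED_KEYS = {"type", "project_id", "private_key", "client_email"}

def check_service_account(raw_json: str) -> bool:
    """Return True if the JSON parses and looks like a service-account key."""
    try:
        data = json.loads(raw_json)
    except json.JSONDecodeError:
        return False
    return data.get("type") == "service_account" and REQUIRED_KEYS <= data.keys()

# Placeholder key for illustration only.
sample = (
    '{"type": "service_account", "project_id": "my-gcp-project", '
    '"private_key": "-----BEGIN PRIVATE KEY-----...", '
    '"client_email": "carto-ai@my-gcp-project.iam.gserviceaccount.com"}'
)
print(check_service_account(sample))  # True
```

This only validates the file's shape; CARTO still verifies that the service account actually has Vertex AI access when you save the configuration.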

***

#### AWS Bedrock

Access Claude models through AWS infrastructure.

**Supported models**: `claude-opus-4.6`, `claude-sonnet-4.6`, `claude-opus-4.5`, `claude-sonnet-4.5`, `claude-haiku-4.5`

**Configuration:**

* **AWS Access Key ID** (required)
* **AWS Secret Access Key** (required)
* **AWS Region** (required): e.g., us-east-1

***

#### Azure OpenAI

Access OpenAI models through Azure infrastructure.

**Supported models**: `gpt-5.2`, `gpt-5`, `gpt-5-mini`, `gpt-4o`, `gpt-4o-mini`

**Configuration**:

* **API Base** (required): Your Azure OpenAI endpoint URL
* **API Key** (required): Your Azure OpenAI API key
* **API Version** (required): API version (e.g., 2025-01-01-preview)
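To see how these three settings fit together, here is a sketch of how an Azure OpenAI request URL is typically composed from the endpoint, a deployment name, and the API version. The resource and deployment names below are hypothetical placeholders:

```python
# Illustrative only: how the Azure OpenAI settings combine into a request URL.
api_base = "https://my-resource.openai.azure.com"   # your API Base
api_version = "2025-01-01-preview"                  # your API Version
deployment = "my-gpt5-deployment"                   # hypothetical deployment name

url = (
    f"{api_base}/openai/deployments/{deployment}"
    f"/chat/completions?api-version={api_version}"
)
print(url)
```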

***

#### Snowflake Cortex

Access AI models directly within your Snowflake environment.

**Supported models**: `claude-opus-4.6`, `claude-sonnet-4.6`, `claude-opus-4.5`, `claude-sonnet-4.5`, `claude-haiku-4.5`, `openai-gpt-5`, `openai-gpt-5-mini`

**Configuration:**

* **API Base** (required): Your Snowflake Cortex endpoint (e.g., `https://myorg-myaccount.snowflakecomputing.com/api/v2/cortex/inference:complete`)
* **API Key** (required): Personal Access Token (PAT)

***

#### Databricks Model Serving

Access models through Databricks serving endpoints.

**Supported models:** `databricks-claude-opus-4.6`, `databricks-claude-sonnet-4.6`, `databricks-claude-opus-4.5`, `databricks-claude-sonnet-4.5`, `databricks-claude-haiku-4.5`, `databricks-gemini-3-pro`, `databricks-gemini-3-flash`, `databricks-gemini-2.5-pro`, `databricks-gemini-2.5-flash`, `databricks-gpt-5.2`, `databricks-gpt-5`, `databricks-gpt-5-mini`

**Configuration**:

* **API Base** (required): Your Databricks serving endpoint URL.
* **API Key** (required): Databricks Personal Access Token

***

#### Oracle Generative AI

Access models through Oracle Cloud.

**Supported models**: `google.gemini-2.5-pro`, `google.gemini-2.5-flash`, `xai.grok-4`

**Configuration:**

* **OCI User OCID** (required)
* **OCI Tenancy OCID** (required)
* **OCI Fingerprint** (required)
* **OCI Private Key** (required)
* **OCI Region** (required): e.g., us-ashburn-1

{% hint style="info" %}
When you **configure a custom provider, it replaces the CARTO-managed models**. Only the models from your configured providers will be available when creating AI Agents.
{% endhint %}

{% hint style="info" %}
**Interested in other providers or models?** We're continuously expanding our AI provider support. If you'd like to see a specific provider or model added, please share your feedback with us.
{% endhint %}

## Managing Models

From the CARTO AI settings page, you can:

* **Add providers:** Configure credentials for a supported provider to make its models available to the organization.
* **Remove providers:** Disable a provider to remove its models from availability. Any models from that provider will no longer be selectable for new agents or AI features.
* **Set default model:** Choose which model to use by default for all CARTO AI features.
* **View available models:** See the full list of models currently available to the organization across all configured providers.

## AI Features

Currently, the following AI features are available in the CARTO platform:

* [**AI Agents**](https://docs.carto.com/carto-user-manual/ai-agents): Enable Editor users to create conversational agents that analyze and interact with maps, using available CARTO tools and MCP tools to provide insights directly in the interface.

<figure><img src="https://3029946802-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FybPdpmLltPkzGFvz7m8A%2Fuploads%2Fgit-blob-454995b435042718a5e81d04b276d0fb3cca52d4%2FScreenshot%202025-10-04%20at%2009.35.54.png?alt=media" alt=""><figcaption></figcaption></figure>


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.carto.com/carto-user-manual/settings/carto-ai.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
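The question must be URL-encoded before it is placed in the `ask` parameter. A minimal Python sketch of building such a query URL (the question string is just an example):

```python
from urllib.parse import quote

# Build a documentation query URL as described above.
page = "https://docs.carto.com/carto-user-manual/settings/carto-ai.md"
question = "Which providers support Claude models?"

url = f"{page}?ask={quote(question)}"
print(url)
```

Issuing an HTTP GET on the resulting URL returns the answer together with the supporting documentation excerpts.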
