Data warehouse connectors move data from your workspace into your warehouse for long-term storage, advanced analytics, and joining with other datasets.

Supported warehouses

  • Snowflake
  • Google BigQuery
  • Amazon Redshift
  • Databricks (Delta Lake)
  • Azure Synapse Analytics

Connection setup

1. Create a destination in the warehouse

In your warehouse, create a dedicated database and schema for the platform’s data. Create a service account or user with write access to that schema. Avoid using root or admin credentials.
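The provisioning step above can be sketched as a short list of DDL statements. All object names here are illustrative, and the GRANT syntax varies by warehouse (Snowflake, for instance, usually grants privileges to roles rather than directly to users), so treat this as a template, not exact syntax:

```python
# Sketch: DDL for a dedicated database/schema and a least-privilege
# service user. Names are illustrative; adapt the GRANT statements to
# your warehouse's privilege model.
def provisioning_ddl(database: str, schema: str, user: str) -> list[str]:
    """Return the statements to run in the warehouse, in order."""
    return [
        f"CREATE DATABASE IF NOT EXISTS {database};",
        f"CREATE SCHEMA IF NOT EXISTS {database}.{schema};",
        f"CREATE USER IF NOT EXISTS {user};",
        # Write access to the destination schema only -- never admin rights.
        f"GRANT USAGE ON DATABASE {database} TO {user};",
        f"GRANT CREATE TABLE, INSERT, UPDATE ON SCHEMA {database}.{schema} TO {user};",
    ]

for stmt in provisioning_ddl("analytics", "platform_sync", "sync_service"):
    print(stmt)
```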

2. Add the warehouse connection

Go to Integrations → Catalog → Data warehouses, select your warehouse type, and enter the connection details:
  • Host / account identifier (Snowflake uses account identifiers; others use hostnames)
  • Database and schema
  • Username and password or service account credentials
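The connection details above can be represented as a simple config, as sketched below. The field names and the validation helper are illustrative, not the platform's actual API; the example value shows the Snowflake account-identifier style:

```python
# Sketch: the connection fields the form asks for, as a plain dict.
# Field names are illustrative, not the platform's exact schema.
REQUIRED_FIELDS = ("host", "database", "schema", "user", "password")

def validate_connection(config: dict) -> list[str]:
    """Return the names of any required fields that are missing or empty."""
    return [f for f in REQUIRED_FIELDS if not config.get(f)]

snowflake_config = {
    "host": "xy12345.us-east-1",   # Snowflake account identifier, not a hostname
    "database": "ANALYTICS",
    "schema": "PLATFORM_SYNC",
    "user": "sync_service",
    "password": "********",
}

assert validate_connection(snowflake_config) == []  # all fields present
```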

3. Configure table settings

Choose which entities to sync to the warehouse (projects, reports, analytics events, audit logs). Each entity maps to one table in the destination schema. Table names are pre-set and cannot be changed.
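Because table names are pre-set, the entity-to-table relationship behaves like a fixed lookup. A minimal sketch, with illustrative table names (the real pre-set names are defined by the platform):

```python
# Sketch: fixed entity-to-table mapping. Table names are illustrative;
# the actual pre-set names cannot be changed by the user.
ENTITY_TABLES = {
    "projects": "projects",
    "reports": "reports",
    "analytics_events": "analytics_events",
    "audit_logs": "audit_logs",
}

def destination_table(entity: str) -> str:
    """Return the destination table for a syncable entity."""
    try:
        return ENTITY_TABLES[entity]
    except KeyError:
        raise ValueError(f"{entity!r} is not a syncable entity") from None
```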

4. Test and activate

Click Test connection to validate credentials and write access. Once the test passes, click Activate to start syncing.
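Conceptually, a connection test like this verifies write access by creating and dropping a scratch table. The sketch below demonstrates the idea with an in-memory SQLite connection; a real warehouse check would use the warehouse's own driver, and the probe-table name is made up:

```python
import sqlite3

# Sketch: verify write access by creating and dropping a scratch table.
# Demonstrated with SQLite; the probe-table name is illustrative.
def has_write_access(conn) -> bool:
    try:
        cur = conn.cursor()
        cur.execute("CREATE TEMP TABLE _sync_probe (id INTEGER)")
        cur.execute("DROP TABLE _sync_probe")
        return True
    except Exception:
        return False

conn = sqlite3.connect(":memory:")
print(has_write_access(conn))  # True
```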

Table configuration

Each synced entity creates one table in the destination schema. Tables use an append-only or upsert write strategy depending on the entity type:
  Entity                     Write strategy
  Events and audit logs      Append-only
  Projects, users, reports   Upsert (update on primary key match)

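The difference between the two write strategies can be sketched with in-memory rows (real syncs, of course, write to warehouse tables; the `id` key is illustrative):

```python
# Sketch of the two write strategies, using in-memory rows.
def append_only(table: list, rows: list) -> None:
    """Events and audit logs: every incoming row is added, duplicates kept."""
    table.extend(rows)

def upsert(table: dict, rows: list, key: str = "id") -> None:
    """Projects, users, reports: update on primary key match, else insert."""
    for row in rows:
        table[row[key]] = row

events = []
append_only(events, [{"id": 1}, {"id": 1}])   # both rows are kept

projects = {}
upsert(projects, [{"id": 1, "name": "a"}, {"id": 1, "name": "b"}])
# projects[1]["name"] == "b" -- the later row wins on a key match
```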
Columns follow a predictable naming convention: snake_case, with timestamps as UTC ISO 8601 strings.
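A sketch of those column conventions, assuming source fields arrive in camelCase (the field name `createdAt` is illustrative):

```python
import re
from datetime import datetime, timezone

# Sketch of the column conventions: camelCase field names become
# snake_case, and timestamps serialize as UTC ISO 8601 strings.
def to_snake_case(name: str) -> str:
    return re.sub(r"(?<=[a-z0-9])([A-Z])", r"_\1", name).lower()

def to_utc_iso8601(ts: datetime) -> str:
    return ts.astimezone(timezone.utc).isoformat()

to_snake_case("createdAt")  # → "created_at"
to_utc_iso8601(datetime(2024, 5, 1, 12, 0, tzinfo=timezone.utc))
# → "2024-05-01T12:00:00+00:00"
```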
Warehouse syncs run on a scheduled basis (minimum 1-hour intervals). Real-time streaming to warehouses is not currently supported.
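Since the minimum interval is fixed, any configured schedule is effectively clamped to it, as this small sketch illustrates (the helper is hypothetical, not a platform API):

```python
from datetime import timedelta

# Sketch: enforce the documented 1-hour minimum sync interval.
MIN_SYNC_INTERVAL = timedelta(hours=1)

def effective_interval(requested: timedelta) -> timedelta:
    """Clamp a configured interval to the supported minimum."""
    return max(requested, MIN_SYNC_INTERVAL)

effective_interval(timedelta(minutes=15))  # → timedelta(hours=1)
```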