# Connect: BigQuery
## Command

```shell
skippr-dbt connect warehouse bigquery \
  --project my-gcp-project \
  --dataset raw_data \
  --location US
```
Or run without flags to be prompted interactively.
## Flags

| Flag | Description |
|---|---|
| `--project` | GCP project ID |
| `--dataset` | BigQuery dataset |
| `--location` | Dataset location (e.g. `US`, `EU`) |
## Config output

Running `connect warehouse bigquery` writes the following to `skippr-dbt.yaml`:

```yaml
warehouse:
  kind: bigquery
  project: my-gcp-project
  dataset: raw_data
  location: US
```
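Since the `warehouse` block is a small flat mapping, a generated config can be sanity-checked without a YAML dependency. A minimal sketch, assuming only the `skippr-dbt.yaml` structure shown above (`load_warehouse_config` is an illustrative helper, not part of the CLI; a real YAML parser such as PyYAML would be the robust choice):

```python
def load_warehouse_config(path):
    """Parse the flat `warehouse:` block written by `connect warehouse bigquery`.

    Deliberately minimal: handles only the two-level key/value layout above.
    """
    config = {}
    in_warehouse = False
    with open(path) as f:
        for line in f:
            if not line.strip() or line.strip().startswith("#"):
                continue
            if not line.startswith(" "):
                # Top-level key: track whether we are inside `warehouse:`
                in_warehouse = line.strip() == "warehouse:"
                continue
            if in_warehouse and ":" in line:
                key, _, value = line.strip().partition(":")
                config[key.strip()] = value.strip()
    return config
```

For the file above, `load_warehouse_config("skippr-dbt.yaml")` yields `{"kind": "bigquery", "project": "my-gcp-project", "dataset": "raw_data", "location": "US"}`.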
## Authentication

Authentication uses a GCP service account key file.

| Variable | Description |
|---|---|
| `GOOGLE_APPLICATION_CREDENTIALS` | Path to a GCP service account JSON key file |
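A quick way to verify the variable before running the CLI is to check that it points to a readable service-account key. A hedged sketch (`check_credentials` is an illustrative helper, not part of skippr-dbt; it checks only fields present in every GCP service-account key file):

```python
import json
import os

def check_credentials():
    """Return (ok, message) for the current GOOGLE_APPLICATION_CREDENTIALS setting."""
    path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if not path:
        return False, "GOOGLE_APPLICATION_CREDENTIALS is not set"
    if not os.path.isfile(path):
        return False, f"no file at {path}"
    try:
        with open(path) as f:
            key = json.load(f)
    except (OSError, json.JSONDecodeError) as exc:
        return False, f"could not parse key file: {exc}"
    if key.get("type") != "service_account":
        return False, "key file is not a service-account key"
    return True, f"using service account {key.get('client_email', '<unknown>')}"
```

This catches the most common failure mode (a stale or mistyped path) before it surfaces as the `Could not automatically determine credentials` error described under Troubleshooting.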
### Setting up a service account

1. In the GCP Console, go to IAM & Admin > Service Accounts.
2. Create a service account with the BigQuery Data Editor and BigQuery Job User roles.
3. Create a JSON key and download it.
4. Set the environment variable:

```shell
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json"
```
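The console steps above can also be scripted with `gcloud`. A sketch, assuming the project `my-gcp-project` from the example and a service account named `skippr-dbt` (both placeholders; substitute your own values):

```shell
# Create the service account (the name is a placeholder)
gcloud iam service-accounts create skippr-dbt \
  --project my-gcp-project \
  --display-name "skippr-dbt"

# Grant the two roles described above
for role in roles/bigquery.dataEditor roles/bigquery.jobUser; do
  gcloud projects add-iam-policy-binding my-gcp-project \
    --member "serviceAccount:skippr-dbt@my-gcp-project.iam.gserviceaccount.com" \
    --role "$role"
done

# Create and download a JSON key, then point the CLI at it
gcloud iam service-accounts keys create service-account.json \
  --iam-account skippr-dbt@my-gcp-project.iam.gserviceaccount.com
export GOOGLE_APPLICATION_CREDENTIALS="$PWD/service-account.json"
```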
### Required permissions

The service account needs:

- `bigquery.datasets.create` (for silver/gold dataset creation)
- `bigquery.tables.create`, `bigquery.tables.updateData` (for loading data)
- `bigquery.jobs.create` (for running queries)

The BigQuery Data Editor and BigQuery Job User roles cover these.
## Troubleshooting

| Symptom | Fix |
|---|---|
| `Could not automatically determine credentials` | Verify `GOOGLE_APPLICATION_CREDENTIALS` points to a valid JSON key file |
| `Access Denied: Dataset` | Check that the service account has the required roles on the project |
| `Not found: Dataset` | The dataset is created automatically; ensure the service account has `bigquery.datasets.create` |