Quick Start: Snowflake¶
Extract data from MSSQL, load it into Snowflake, and generate dbt models -- in under 5 minutes.
Prerequisites¶
- `skippr-dbt` and `skippr` on PATH (Install)
- Python venv with `dbt-core` and `dbt-snowflake` installed
- Environment variables set:
export LLM_API_KEY="sk-..."
export SNOWFLAKE_ACCOUNT="MYORG-MYACCOUNT"
export SNOWFLAKE_USER="myuser"
export SNOWFLAKE_PRIVATE_KEY_PATH="/path/to/snowflake_key.p8"
export MSSQL_CONNECTION_STRING="server=tcp:127.0.0.1,1433;database=testdb;user id=sa;password=YourPass;TrustServerCertificate=true"
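Before continuing, it can help to confirm these variables are actually exported in your current shell. A minimal sketch (the variable names are the ones listed above; the `check_env` helper is hypothetical, not part of skippr-dbt):

```shell
# Hypothetical helper: report any required variable that is unset or empty.
check_env() {
  for v in "$@"; do
    if [ -z "$(printenv "$v")" ]; then
      echo "Missing: $v" >&2
      return 1
    fi
  done
  echo "All prerequisites set"
}

check_env LLM_API_KEY SNOWFLAKE_ACCOUNT SNOWFLAKE_USER \
          SNOWFLAKE_PRIVATE_KEY_PATH MSSQL_CONNECTION_STRING \
  || echo "Set the variables above before continuing"
```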
See Connect: Snowflake for key-pair setup and Connect: MSSQL for connection string details.
1. Initialise the project¶
mkdir my-workspace && cd my-workspace
skippr-dbt init mssql-migration
2. Connect the warehouse¶
skippr-dbt connect warehouse snowflake \
  --database ANALYTICS \
  --schema RAW \
  --warehouse COMPUTE_WH \
  --role ACCOUNTADMIN
3. Connect the source¶
skippr-dbt connect source mssql \
  --connection-string '${MSSQL_CONNECTION_STRING}'
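The single quotes matter here: they stop the shell from expanding `${MSSQL_CONNECTION_STRING}`, so the literal placeholder is stored in the generated config and resolved from the environment at run time rather than baked into the file. A quick illustration of the difference (`DEMO_SECRET` is just an example variable, not used by skippr-dbt):

```shell
export DEMO_SECRET="s3cret"   # example variable for illustration only
echo '${DEMO_SECRET}'         # single quotes: literal placeholder
echo "${DEMO_SECRET}"         # double quotes: expanded by the shell
```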
4. Check prerequisites¶
skippr-dbt doctor
5. Run the pipeline¶
skippr-dbt run
The pipeline will:
- Discover source schemas from MSSQL.
- Sync data into Snowflake bronze tables.
- Verify the destination tables are queryable.
- Plan a silver (staging) layer with one model per raw table.
- Author dbt SQL models with type casting and column mapping.
- Validate by running `dbt compile` and `dbt run` against the warehouse.
6. Verify outputs¶
Generated config¶
# skippr-dbt.yaml
project: mssql_migration
warehouse:
  kind: snowflake
  database: ANALYTICS
  schema: RAW
  warehouse: COMPUTE_WH
  role: ACCOUNTADMIN
source:
  kind: mssql
  connection_string: ${MSSQL_CONNECTION_STRING}
dbt models¶
models/
├── schema.yml # source definitions
└── staging/
├── stg_raw_customers.sql # silver model
└── stg_raw_orders.sql # silver model
Snowflake schemas¶
| Schema | Contents |
|---|---|
| `ANALYTICS.RAW` | Bronze -- raw MSSQL data |
| `ANALYTICS.MSSQL_MIGRATION_SILVER` | Silver -- staged and cleansed |
| `ANALYTICS.MSSQL_MIGRATION_GOLD` | Gold -- mart-ready models |
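The silver and gold schema names appear to follow a simple pattern: the project name from `skippr-dbt.yaml`, upper-cased, with a `_SILVER` or `_GOLD` suffix. This is an inference from the table above, not documented behaviour, but it is easy to reproduce:

```shell
# Assumed naming pattern: <DATABASE>.<PROJECT_UPPERCASED>_SILVER / _GOLD
project="mssql_migration"   # from skippr-dbt.yaml
upper=$(printf '%s' "$project" | tr '[:lower:]' '[:upper:]')
echo "ANALYTICS.${upper}_SILVER"   # ANALYTICS.MSSQL_MIGRATION_SILVER
echo "ANALYTICS.${upper}_GOLD"    # ANALYTICS.MSSQL_MIGRATION_GOLD
```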