
Quick Start: Snowflake

Five commands to go from a SQL Server database to materialised dbt models in Snowflake -- bronze, silver, and gold layers, all generated and validated automatically.

Prerequisites

  • skippr on PATH (Install -- includes a Windows PowerShell one-liner)
  • Python venv with dbt-core and dbt-snowflake
  • OpenSSL installed for key-pair auth (pre-installed on macOS/Linux; Windows: winget install OpenSSL)
  • Authenticated via skippr user login (or SKIPPR_API_KEY for CI)
  • Snowflake and MSSQL credentials in your environment:
```bash
export SNOWFLAKE_ACCOUNT="MYORG-MYACCOUNT"
export SNOWFLAKE_USER="myuser"
export SNOWFLAKE_PRIVATE_KEY_PATH="/path/to/snowflake_key.p8"
export MSSQL_CONNECTION_STRING="server=tcp:127.0.0.1,1433;database=testdb;user id=sa;password=YourPass;TrustServerCertificate=true"
```
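Since skippr doctor only runs later in the flow, a quick shell-level sanity check can catch a missing export up front. This helper is plain bash, not a skippr feature:

```bash
# Plain shell helper (not part of skippr): report any unset or
# empty environment variable before starting a run.
require_vars() {
  for v in "$@"; do
    if [ -z "$(printenv "$v")" ]; then
      echo "missing: $v" >&2
      return 1
    fi
  done
  echo "all credentials present"
}

require_vars SNOWFLAKE_ACCOUNT SNOWFLAKE_USER \
  SNOWFLAKE_PRIVATE_KEY_PATH MSSQL_CONNECTION_STRING \
  || echo "set the exports above, then re-run this check"
```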

Need help with credentials? See the Snowflake connector guide for auth setup (including service accounts) and MSSQL for connection strings.
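The key file referenced by SNOWFLAKE_PRIVATE_KEY_PATH can be created with the OpenSSL commands from Snowflake's standard key-pair setup; the filename here is just an example:

```bash
# Generate an unencrypted PKCS#8 private key for Snowflake key-pair auth
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out snowflake_key.p8 -nocrypt

# Derive the matching public key to register on the Snowflake user
openssl rsa -in snowflake_key.p8 -pubout -out snowflake_key.pub
```

Register the contents of snowflake_key.pub on your Snowflake user (ALTER USER myuser SET RSA_PUBLIC_KEY='...'), with the PEM header and footer lines stripped.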

Build the pipeline

```bash
# 1. Create the project
mkdir my-workspace && cd my-workspace
skippr init mssql-migration

# 2. Point at your warehouse
skippr connect warehouse snowflake \
  --database ANALYTICS \
  --schema RAW \
  --warehouse COMPUTE_WH \
  --role ACCOUNTADMIN

# 3. Point at your source
skippr connect source mssql \
  --connection-string '${MSSQL_CONNECTION_STRING}'

# 4. Verify everything is wired up
skippr doctor

# 5. Run it
skippr run
```

That's it. skippr run discovers your source schemas, extracts the data, loads it into Snowflake, and generates a complete dbt project with silver and gold models -- compiled and materialised.

What you get

dbt models (ready to extend)

```
models/
├── schema.yml                   # source definitions
└── staging/
    ├── stg_raw_customers.sql    # silver model
    └── stg_raw_orders.sql       # silver model
```

Snowflake schemas (populated and queryable)

| Schema                           | Contents                     |
|----------------------------------|------------------------------|
| ANALYTICS.RAW                    | Bronze -- raw extracted data |
| ANALYTICS.MSSQL_MIGRATION_SILVER | Silver -- staged and cleansed |
| ANALYTICS.MSSQL_MIGRATION_GOLD   | Gold -- mart-ready models    |

Project config

```yaml
# skippr.yaml
project: mssql_migration

warehouse:
  kind: snowflake
  database: ANALYTICS
  schema: RAW
  warehouse: COMPUTE_WH
  role: ACCOUNTADMIN

source:
  kind: mssql
  connection_string: ${MSSQL_CONNECTION_STRING}
```

What this quickstart proves

  • The runner reads MSSQL data and writes it directly into Snowflake.
  • Skippr generates a reviewable dbt project instead of hiding the result behind a proprietary format.
  • Authentication and control-plane services are cloud-backed, but row-level source data is not routed through that cloud path.
  • For the next layer of trust, read How It Works and CDC Guarantees.

What's next

  • Run skippr run again -- it's incremental: only new and changed rows are synced.
  • The dbt project is yours. Add tests, snapshots, or custom gold models.
  • See How It Works for the full pipeline breakdown.
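As a starting point for extending the project, standard dbt tests can be declared against the generated staging models. A sketch -- the model name comes from the tree above, but the column name is an assumption; adjust it to your source, then run dbt test inside the project:

```yaml
# models/staging/schema.yml (illustrative fragment)
version: 2

models:
  - name: stg_raw_customers
    columns:
      - name: customer_id   # assumed column name -- replace with a real key
        tests:
          - unique
          - not_null
```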