Exercise 6.1 – Write a schemachange Migration + Jenkinsfile Skeleton
Given your GitOps background, this is a pattern-match exercise. Build a small migration repo, run schemachange locally, then sketch the Jenkinsfile. 30 minutes.
Step 1 – Install schemachange locally
pip install schemachange
schemachange --help
Step 2 – Create the repo layout
mkdir -p ude-snowflake-migrations/migrations
cd ude-snowflake-migrations
Create migrations/V1.0.0__initial_schema.sql:
USE DATABASE LEARN_DB;
CREATE SCHEMA IF NOT EXISTS CI_DEMO;
USE SCHEMA CI_DEMO;
CREATE TABLE IF NOT EXISTS meters (
    meter_id STRING PRIMARY KEY,
    premise_zip STRING,
    installed_at TIMESTAMP_NTZ
);
Create migrations/V1.0.1__add_meter_type.sql:
USE SCHEMA LEARN_DB.CI_DEMO;
ALTER TABLE meters ADD COLUMN IF NOT EXISTS meter_type STRING DEFAULT 'AMI';
Create migrations/R__meter_quality_view.sql (the R__ prefix marks a repeatable script: instead of being tracked by version, it is re-applied whenever its checksum changes):
USE SCHEMA LEARN_DB.CI_DEMO;
CREATE OR REPLACE VIEW meters_by_zip AS
SELECT premise_zip, COUNT(*) AS n_meters FROM meters GROUP BY premise_zip;
Step 3 – Run schemachange against your trial
export SF_ACCOUNT="xy12345.us-west-2.aws"
export SF_USER="..."
export SNOWFLAKE_PASSWORD="..."   # schemachange reads the password from this env var, not a CLI flag
schemachange \
-f ./migrations \
-a $SF_ACCOUNT \
-u $SF_USER \
-r ACCOUNTADMIN \
-w LEARN_WH \
-d LEARN_DB \
--create-change-history-table
The first run should apply all 3 scripts and create SCHEMACHANGE.CHANGE_HISTORY. Run it again – the versioned scripts report nothing to apply, and the repeatable script is skipped as well because its checksum hasn't changed. Edit the view definition and re-run to watch only the R__ script re-apply. That's idempotency.
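To see exactly what schemachange recorded, query the change-history table. A sketch, assuming the default location (the tool documents METADATA.SCHEMACHANGE.CHANGE_HISTORY as the default; adjust the fully qualified name if you pointed it elsewhere with -c):

```sql
-- Inspect the deployment log schemachange maintains.
-- Change the fully qualified name if your history table lives elsewhere.
SELECT version, script, script_type, status, installed_on
FROM METADATA.SCHEMACHANGE.CHANGE_HISTORY
ORDER BY installed_on;
```

Versioned scripts appear once with their version; repeatable scripts get a new row each time their checksum changes.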
Step 4 – Sketch the Jenkinsfile
Create Jenkinsfile at repo root:
pipeline {
    agent any
    environment {
        SF_ACCOUNT         = credentials('snowflake-account')
        SF_USER            = credentials('snowflake-user')
        SNOWFLAKE_PASSWORD = credentials('snowflake-password')  // read by schemachange
        SNOWSQL_PWD        = credentials('snowflake-password')  // read by snowsql
    }
    stages {
        stage('Lint') {
            steps { sh 'sqlfluff lint migrations/ --dialect snowflake' }
        }
        stage('Dry run (PR)') {
            when { changeRequest() }
            steps {
                // Create a PR-scoped zero-copy clone, apply migrations there
                sh '''
                    snowsql -a $SF_ACCOUNT -u $SF_USER -q "
                        CREATE OR REPLACE DATABASE CI_$CHANGE_ID CLONE LEARN_DB;
                    "
                    schemachange -f ./migrations -a $SF_ACCOUNT -u $SF_USER -d CI_$CHANGE_ID -w LEARN_WH -r ACCOUNTADMIN
                '''
            }
            post {
                always {
                    sh 'snowsql -a $SF_ACCOUNT -u $SF_USER -q "DROP DATABASE IF EXISTS CI_$CHANGE_ID;"'
                }
            }
        }
        stage('Deploy to prod') {
            when { branch 'main' }
            steps {
                sh 'schemachange -f ./migrations -a $SF_ACCOUNT -u $SF_USER -d LEARN_DB -w LEARN_WH -r ACCOUNTADMIN'
            }
        }
    }
    post {
        failure {
            slackSend channel: '#data-engineering', message: "Snowflake migration failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}"
        }
    }
}
Step 5 – Step back and notice
This Jenkinsfile is 95% the shape of a generic Jenkinsfile. The DE-specific parts are:
- The "Dry run (PR)" stage uses Snowflake's CLONE to get a throwaway preview environment.
- Linting via sqlfluff (standard for SQL projects).
- No container build – the "artifact" is versioned SQL.
Write out what you'd extend with next (dbt test stage? Matillion deploy via API?) in your notepad.
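As a starting point for that notepad sketch, one common extension is a dbt test stage after the prod deploy. A minimal sketch, assuming dbt is on the agent's PATH and a hypothetical dbt project lives in a dbt/ directory with a target named ci (both names are assumptions, not part of this exercise's repo):

```groovy
// Hypothetical extension stage: run dbt tests after a prod deploy.
// Assumes ./dbt holds a dbt project whose profiles define a 'ci' target
// pointing at LEARN_DB.
stage('dbt test') {
    when { branch 'main' }
    steps {
        sh 'dbt test --project-dir dbt --target ci'
    }
}
```

A Matillion deploy would follow the same pattern: another stage gated on branch 'main', calling its REST API via sh + curl.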
Cleanup
DROP SCHEMA IF EXISTS LEARN_DB.CI_DEMO CASCADE;
DROP SCHEMA IF EXISTS LEARN_DB.SCHEMACHANGE CASCADE;
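One extra cleanup step may apply, depending on configuration: schemachange documents its default change-history table as METADATA.SCHEMACHANGE.CHANGE_HISTORY, so if you didn't override the location with -c, a METADATA database may have been created in your trial account:

```sql
-- Only needed if the change-history table landed in the default METADATA
-- database rather than in LEARN_DB.
DROP DATABASE IF EXISTS METADATA;
```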