Exercise 5.1 — Matillion Hub Trial + First Job

Stand up a Matillion Hub trial, connect it to your Snowflake trial, and build a transformation that reads meter_reads and writes hourly_consumption_by_zip. This is your first hands-on feel of push-down ELT.

Prereq: Snowflake trial (Ex 2.1), meter_reads table exists. Time: 45 min. Cost: ~$0.02 Snowflake credits. Matillion trial is free.
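Before starting, you can sanity-check the prereq from a Snowflake worksheet. This is a hedged sketch: the column name read_ts is assumed from the query shown in Step 4.

SELECT COUNT(*)    AS n_reads,
       MIN(read_ts) AS first_read,
       MAX(read_ts) AS last_read
FROM LEARN_DB.UTILITY.METER_READS;

A nonzero count confirms Exercise 2.1 left the table in place.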

Step 1 — Sign up for Matillion Hub

Step 2 — Connect Matillion to Snowflake
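Matillion will prompt for an account URL, user, role, warehouse, and default database/schema. If you want to keep the trial tidy, a dedicated role and extra-small warehouse work well. A hedged sketch; the names LEARN_WH and MATILLION_ROLE are illustrative, not prescribed by this exercise:

CREATE WAREHOUSE IF NOT EXISTS LEARN_WH
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60;  -- suspend after 60 s idle to limit credit burn
CREATE ROLE IF NOT EXISTS MATILLION_ROLE;
GRANT USAGE ON WAREHOUSE LEARN_WH TO ROLE MATILLION_ROLE;
GRANT USAGE ON DATABASE LEARN_DB TO ROLE MATILLION_ROLE;
GRANT ALL PRIVILEGES ON SCHEMA LEARN_DB.UTILITY TO ROLE MATILLION_ROLE;
GRANT ROLE MATILLION_ROLE TO USER <your_user>;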

Step 3 — Build a transformation

Your goal: given the synthetic meter_reads from Exercise 2.1, produce an hourly_consumption_by_zip summary.

Step 4 — See the SQL Matillion generated

This is the magic moment. In the job detail panel, find the "SQL" or "Generated SQL" view. Matillion compiled your 4-node graph into a single statement like:

CREATE OR REPLACE TABLE LEARN_DB.UTILITY.HOURLY_CONSUMPTION_BY_ZIP AS
SELECT
  premise_zip,
  DATE_TRUNC('hour', read_ts) AS read_hour,
  SUM(consumption_kwh) AS hourly_kwh,
  COUNT(*) AS reads_in_hour
FROM LEARN_DB.UTILITY.METER_READS
GROUP BY premise_zip, DATE_TRUNC('hour', read_ts);

The entire transformation — 4 visual nodes — becomes one SQL statement. Snowflake executes it. Matillion did zero compute.
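You can confirm the push-down from the Snowflake side: the statement appears in query history and all compute is billed to your warehouse. A sketch using the standard INFORMATION_SCHEMA.QUERY_HISTORY table function:

SELECT query_text, warehouse_name, total_elapsed_time
FROM TABLE(LEARN_DB.INFORMATION_SCHEMA.QUERY_HISTORY())
WHERE query_text ILIKE '%HOURLY_CONSUMPTION_BY_ZIP%'
ORDER BY start_time DESC
LIMIT 5;

If Matillion truly pushed the work down, the CREATE OR REPLACE TABLE statement shows up here with a nonzero elapsed time on your warehouse.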

Step 5 — Wrap in an Orchestration Job

That's the full Matillion pattern: Orchestration wraps Transformation wraps SQL.
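For contrast, if you ever need the same wrap-a-transformation-in-a-schedule pattern without Matillion, the closest native Snowflake construct is a scheduled TASK. A rough sketch only; the warehouse name LEARN_WH is illustrative:

CREATE OR REPLACE TASK REFRESH_HOURLY_CONSUMPTION
  WAREHOUSE = LEARN_WH
  SCHEDULE = '60 MINUTE'
AS
  CREATE OR REPLACE TABLE LEARN_DB.UTILITY.HOURLY_CONSUMPTION_BY_ZIP AS
  SELECT
    premise_zip,
    DATE_TRUNC('hour', read_ts) AS read_hour,
    SUM(consumption_kwh) AS hourly_kwh,
    COUNT(*) AS reads_in_hour
  FROM LEARN_DB.UTILITY.METER_READS
  GROUP BY premise_zip, DATE_TRUNC('hour', read_ts);

Matillion's value over a bare TASK is everything around the SQL: the visual graph, variables, and error-handling branches.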

Step 6 — Verify in Snowflake & cleanup

SELECT * FROM LEARN_DB.UTILITY.HOURLY_CONSUMPTION_BY_ZIP LIMIT 20;
SELECT COUNT(*) FROM LEARN_DB.UTILITY.HOURLY_CONSUMPTION_BY_ZIP;

Expect one row per distinct (zip, hour) bucket: with 300 zips and 24 hourly buckets that's ~7,200 rows (300 × 24). Drop when done:
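Before dropping the table, a quick reconciliation check is worth running: total kWh in the summary should exactly match total kWh in the source, since every read lands in exactly one hour bucket. Column names are assumed from the query in Step 4.

SELECT
  (SELECT SUM(consumption_kwh) FROM LEARN_DB.UTILITY.METER_READS) AS source_kwh,
  (SELECT SUM(hourly_kwh) FROM LEARN_DB.UTILITY.HOURLY_CONSUMPTION_BY_ZIP) AS summary_kwh;

The two numbers should be identical; any gap means rows were filtered or duplicated somewhere in the job.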

DROP TABLE IF EXISTS LEARN_DB.UTILITY.HOURLY_CONSUMPTION_BY_ZIP;

Interview talking point — save to notepad

"In Matillion I authored the transformation as a visual graph, but the generated SQL is a single CREATE OR REPLACE TABLE statement pushed down to Snowflake. So Matillion orchestrates, Snowflake computes โ€” which means performance tuning happens at the warehouse-sizing level, not the Matillion level."