Hi all,
I'm using Matillion ETL for Snowflake to connect to a relational database source to ingest and transform data into Snowflake.
Let's say I have 5 hand-coded ingest jobs, each handling the peculiarities of its own source:
- Job A
- Job B
- Job C
- Job D
- Job E
For each of these ingest jobs, I can determine common attributes, such as:
- Number of delta records ingested
- Earliest modified datetime of the records captured in the delta
- Most recent modified datetime of the records captured in the delta
Since each of the 5 jobs can generate these common attributes, I'd like to create one shared orchestration/transformation job that each of the 5 calls to do the logging (as opposed to hardcoding the logging within each job separately).
Rather than hardcoding the name of the calling job (i.e. one of the 5), it'd be good if I could determine it at runtime and pass it into the shared job (e.g. by setting a scalar or grid variable).
Whilst I haven't yet moved this into a separate job, I've successfully used an "SQL Script" component within an orchestration job to insert variables such as:
- ${job_id}
- ${component_id}
But their values come back as "-1", and I'm not sure why.
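For reference, the SQL Script component body looks roughly like this (the logging table name and columns are illustrative, not from an actual setup):

```sql
-- Body of the "SQL Script" component.
-- Matillion substitutes the ${...} tokens before the statement runs in Snowflake.
-- etl_log.ingest_audit is a made-up example table.
INSERT INTO etl_log.ingest_audit (job_id, component_id, logged_at)
VALUES (${job_id}, ${component_id}, CURRENT_TIMESTAMP());
```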
But if I try these variables, the component fails outright:
- ${job_name}
- ${component_name}
So: is it possible to access these variables, what are their expected values, and is there documentation listing what's available?
Unfortunately, searching for anything with "job" (e.g. "job name variables") just returns documentation about job-scoped variables.
The next step would be to move the logging out of an "SQL Script" component embedded within each of the 5 jobs and into a separate shared job, but I could live with the embedded approach if that's all that's possible.
Hoping someone has done something similar in native Matillion, as opposed to Bash/Python.
Cheers,
Mick