I have an orchestration job that transfers files from a Windows file share to Azure Blob Storage. Which file share and which files (regex filter) to transfer are passed in via variables.
The job is called from various other jobs (each project-related and independent of each other).
On each run, the transfer job generates a unique folder name and stores the transferred files in that subfolder on Azure.
I need a way to pass that unique folder name back to the calling job, so it can fetch the files that were transferred for it and process them into Snowflake.
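For context, the per-run folder name is built roughly like this (a stand-alone Python sketch of the idea, not the actual job logic; the function and prefix names are illustrative):

```python
import uuid
from datetime import datetime, timezone

def make_run_folder(prefix: str = "transfer") -> str:
    """Build a unique per-run folder name such as 'transfer/20240101T120000_3f2a...'.

    The timestamp keeps folders sortable by run time; the UUID suffix
    guarantees uniqueness even when parallel runs start in the same second.
    """
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S")
    return f"{prefix}/{stamp}_{uuid.uuid4().hex}"

folder = make_run_folder()
# Each transferred file is then written under that folder, e.g.:
blob_name = f"{folder}/example.csv"
```

Since Blob Storage has no real directories, the "subfolder" is just this prefix on each blob name, so the caller only needs the single `folder` string to find everything from that run.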
Is there a way to pass variables back up to the caller?
I have tried public and shared variables, but those only seem to be visible within the job itself.
An environment variable is not reliable either, because jobs may run in parallel and the variable would be overwritten by a different run of the same job.
I would rather not use a database table as a communication channel, as it would fill up with past-run information over time.