Hello Team, I have a requirement to create a pipeline that ingests data (batch/CDC) from Salesforce into Snowflake, using S3 as a staging area. The pipeline should be able to dynamically create folders in S3 and store the files for each load (batch/CDC).

Questions:

  1. Which is the better option to implement this use case: Matillion Data Loader or Matillion ETL?
  2. If Matillion ETL is used, is there an option to batch-load all of the tables in a single job?
  3. How can folders be dynamically created in S3, and objects stored in them, after every load?

Hi @sreevidhyas

Pardon the delayed response. To answer your questions:

1. Both Matillion Data Loader and Matillion ETL can implement this use case.

2. Yes, by using variables (see: https://documentation.matillion.com/docs/2968378#using-variables); a sketch of driving all tables from one job is included below.

3. Use a variable in the "stage" name of the Salesforce Query component (see: https://documentation.matillion.com/docs/2389995); a sketch of building a per-load S3 prefix and passing it through such a variable is also included below.
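
To expand on point 2: one common approach in Matillion ETL is to put the list of Salesforce objects into a grid variable and drive a Table Iterator from it. The sketch below is only illustrative: the object list, the grid variable name `sf_object_grid`, and the target-table naming are assumptions, and the `context` helper only exists inside Matillion's Python Script component.

```python
# Sketch for a Matillion ETL Python Script component (names are assumptions).
# The "context" object is injected by Matillion at runtime; the NameError
# guard lets the same script run standalone for a quick test.

# Salesforce objects to load in one job run; this list could also be built
# dynamically, e.g. from a metadata table in Snowflake.
salesforce_objects = ["Account", "Contact", "Opportunity", "Lead"]

# Shape the list into grid-variable rows: [source_object, target_table]
rows = [[obj, "SF_" + obj.upper()] for obj in salesforce_objects]

try:
    # "sf_object_grid" is an assumed grid variable defined on the job; a
    # Table Iterator can loop over it and map each row onto the Salesforce
    # Query component's properties.
    context.updateGridVariable("sf_object_grid", rows)
except NameError:
    # Running outside Matillion: just print what would be loaded.
    for row in rows:
        print(row)
```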
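
To expand on point 3: S3 has no real folders, only key prefixes, so "create a folder per load" usually means building a unique prefix (for example with a run timestamp) and staging that load's files under it. The sketch below is a minimal illustration with assumed names (`my-staging-bucket`, `s3_load_prefix`) and assumes boto3 is available with valid AWS credentials; inside Matillion the prefix can be stored in a job variable and referenced as ${s3_load_prefix} in the component's S3 staging settings.

```python
# Sketch of building a per-load S3 "folder" (prefix); bucket and variable
# names below are placeholders.
from datetime import datetime, timezone

import boto3

BUCKET = "my-staging-bucket"   # assumed staging bucket
LOAD_TYPE = "batch"            # or "cdc"

# Unique prefix per load, e.g. salesforce/batch/2024-05-01T10-30-00/
run_ts = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H-%M-%S")
prefix = "salesforce/{}/{}/".format(LOAD_TYPE, run_ts)

# S3 has no real folders: writing any object under the prefix is enough.
# A zero-byte marker object makes the "folder" visible in the console.
s3 = boto3.client("s3")
s3.put_object(Bucket=BUCKET, Key=prefix)

try:
    # Inside Matillion, hand the prefix to the job via a variable (assumed
    # name "s3_load_prefix") so the Salesforce Query component's staging
    # settings can reference it as ${s3_load_prefix}.
    context.updateVariable("s3_load_prefix", prefix)
except NameError:
    print("s3://{}/{}".format(BUCKET, prefix))
```

Partitioning the prefix further, for example by object name, also keeps batch and CDC files separate per load, which tends to simplify downstream loading into Snowflake.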

Thank you,

Chika