Incremental load fails for a table when a new column is added in the source table. What can be done to solve this issue?

We have an incremental load failing for some tables when a new column is added in the source tables. We use the 'Table Update' component for the incremental load, and if I am not wrong, the load fails because the component does not recognise the new column. What could be done to solve this issue?

Sorry, can you elaborate on the issue a bit more?

Hi NeelamMacwan,

Did you get a solution? I am also facing the same issue.

Please let me know.

Hi rathod9127,

Sorry for the late reply.

No, this is a bug in Matillion and nothing I tried worked. Our pipeline still fails from time to time.

Neelam

Hi, any solution to this issue? We get this once in a while, and sometimes it works fine on a re-run. We are connecting to Redshift.

Not sure if it's specific to Redshift. Do you, @NeelamMacwan and @rathod9127, connect to Redshift as well? Was your issue resolved?

Hi @Preethi,

The issue is not resolved for us yet.

However, you could use the JDBC Incremental Load connector if it is available for Redshift (I use Snowflake). It accommodates schema drift, so any change in the source schema will be mirrored in the target table as well.

This is the link: https://docs.matillion.com/metl/docs/2955330/

If not, you can create a custom pipeline that checks for schema changes at the source and applies them to the target before every load.
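As a rough sketch of that custom check, the idea is to compare the source and target column lists and issue an ALTER TABLE ADD COLUMN for anything missing before the load runs. This demo uses Python's built-in sqlite3 so it is runnable as-is; against Redshift or Snowflake you would query information_schema.columns instead, and all table and column names below are made up:

```python
import sqlite3

def columns(conn, table):
    """Return {column_name: declared_type} for a table (sqlite3 PRAGMA)."""
    return {row[1]: row[2] for row in conn.execute(f"PRAGMA table_info({table})")}

def sync_schema(conn, source, target):
    """Add to the target any columns present in the source but missing in the target."""
    src, tgt = columns(conn, source), columns(conn, target)
    added = []
    for name, coltype in src.items():
        if name not in tgt:
            conn.execute(f"ALTER TABLE {target} ADD COLUMN {name} {coltype}")
            added.append(name)
    return added

# Demo: the source gained a new column c4 that the target lacks
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source_a (c1 TEXT, c2 TEXT, c3 TEXT, c4 INTEGER)")
conn.execute("CREATE TABLE target_a (c1 TEXT, c2 TEXT, c3 TEXT)")
added = sync_schema(conn, "source_a", "target_a")
print(added)  # ['c4']
```

Running this step at the start of the orchestration job keeps the target aligned with the source, so the incremental component never sees a column mismatch.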

Hope this helps.

Neelam

Hi Ganesh,

To put it simply, whenever a new column is added to the source table, the incremental run does not load data into the target table because of the column mismatch between source and target.

I have to delete the target table and do a full refresh. Is there any way I can avoid this when a new column is added at the source?

For example, source.table_a has columns 1, 2, 3 and target.table_a has the same columns 1, 2, 3. When a new column 4 is added in source.table_a, I have to initiate a full load for table_a, which I do not want to do.
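The mismatch described above can be reproduced in miniature with Python's built-in sqlite3 (all table and column names here are hypothetical, standing in for the real source and target tables):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# target_a was created before column c4 existed in the source
conn.execute("CREATE TABLE source_a (c1, c2, c3, c4)")
conn.execute("CREATE TABLE target_a (c1, c2, c3)")
conn.execute("INSERT INTO source_a VALUES (1, 2, 3, 4)")

err = None
try:
    # the incremental copy fails once the column counts diverge
    conn.execute("INSERT INTO target_a SELECT * FROM source_a")
except sqlite3.OperationalError as e:
    err = e
print(err)
```

Until the new column is added to the target (or explicitly excluded from the load), every incremental run hits this kind of error, which is why a full refresh appears to "fix" it.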