How to update CSV header

Hi!

I am currently working on a job to get data from a Redshift view and upload it to an SFTP server. These are the steps I used to build it:


1) Create a table or view with the query result in your Redshift database

2) Use S3 unload component to create a CSV file in your S3 bucket

3) Use the Data Transfer Component to upload the file from S3 to your target SFTP server


My issue is that I need the header of the CSV file to be uppercase, but it's always lowercase. Could you please advise on a solution for this? Is there any component that I can use to achieve this?

Thanks in advance!

Hi @ana.queiroz,

I believe I am following you, but feel free to correct me: you are trying to export data from a Redshift view in CSV format and drop it on an SFTP server. If that's correct, what I can tell you is that Matillion is not exporting the data from the view with a lower-case header row; it exports the column names exactly as they are defined in Redshift. So I would start by making sure the view defines all of the column names in upper case. If they are already in upper case, then it could be a by-product of Redshift. We are a Snowflake shop, and I know from experience that the unload component unloads the data exactly as it is in the source view or table in Snowflake; Redshift could work differently in this respect.

Let us know if this helps at all. There are probably other options, like staging the file in a location, manipulating the file, and then pushing it to the SFTP server.

Hi - funny timing on this question, as I have exactly the same use case / scenario as ana.queiroz.


And the answer from Bryan is spot on.


So my question goes back one level. How do we upper-case column names in Redshift?


It seems that the Create Table component does not carry an upper-case naming convention for column names through into Redshift itself, and there is no component option for modifying the PRESERVE_COL_NAMES= setting.


Is a SQL component required to make this happen?


Thanks!

Hi.

Redshift database objects (i.e. databases, schemas, tables and columns) are all stored in lowercase in the metadata behind the scenes. When you use the S3 Unload Component in Matillion, it is actually Redshift that does the work of unloading the data, so unfortunately there is no way around this.


A solution that we have used previously is to unload the data to S3 in CSV format, then use some Python to insert the correctly formatted column headers at the top of the file.
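For reference, here is a minimal sketch of that approach in plain Python. It assumes the unloaded file has already been staged somewhere the script can read it (the file paths below are placeholders) and that the first line of the file is the header row:

```python
import csv

# Minimal sketch: rewrite the header row of an unloaded CSV file so the
# column names are upper case, leaving the data rows untouched.
# The paths are placeholders - point them at wherever the file is staged.
src_path = "unloaded_data.csv"
dst_path = "unloaded_data_upper.csv"

with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
    reader = csv.reader(src)
    writer = csv.writer(dst)

    header = next(reader)                      # first row is the header
    writer.writerow([col.upper() for col in header])

    for row in reader:                         # copy the remaining rows as-is
        writer.writerow(row)
```

In a Matillion job, a script along these lines could run from a Python Script component between the unload and the Data Transfer step; if the file needs to be read from and written back to S3 directly, something like boto3 would be needed to download and re-upload it.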


Hope this helps.