How to send a CSV file from Snowflake to S3?

Hello!

 

Just wanted to ask if it's possible for Matillion to output a CSV file from Snowflake and then send it to AWS S3?

 

Regards,

Joey

Hi Joey, I think you are looking for the "S3 Unload Component": https://documentation.matillion.com/docs/2167231
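
I believe the component generates a Snowflake COPY INTO <location> statement under the hood, so the equivalent raw SQL looks roughly like this (the bucket, table, and credential values are placeholders):

    -- Unload a table to S3 as CSV (placeholder names throughout)
    COPY INTO 's3://my-bucket/exports/my_table/'
    FROM my_database.my_schema.my_table
    CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
    FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"')
    HEADER = TRUE;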

How can this be done when the instance is hosted on Azure?

Hi JChri,

 

For Azure instances there is the Azure Blob Storage Unload component, which unloads a table into files in a storage account. Matillion also has a Data Transfer component, which can copy a file from a storage account into S3.
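
For reference, the unload itself boils down to the same Snowflake COPY INTO <location> statement, just pointed at a blob container; a rough sketch with placeholder account, container, and table names:

    -- Unload a table to Azure Blob Storage as CSV (placeholder names throughout)
    COPY INTO 'azure://myaccount.blob.core.windows.net/mycontainer/exports/'
    FROM my_database.my_schema.my_table
    CREDENTIALS = (AZURE_SAS_TOKEN = '?sv=...')
    FILE_FORMAT = (TYPE = CSV)
    HEADER = TRUE;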

 

The docs don't say how you're supposed to authenticate to AWS, but I'd guess it's by setting AWS credentials in your Environment.

 

Schepo

My Matillion instance is on Azure with Snowflake as the database, but I want to load data into an S3 bucket. I wanted to unload directly into S3 with a COPY INTO command, without having to load into Blob Storage first.

I have been able to do it. The only question I have, really, is how I can do this without specifying the AWS credentials in plaintext in the COPY INTO command. I've written the command in a regular SQL component in an orchestration job.
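
One way to keep the credentials out of the SQL entirely is a Snowflake storage integration plus an external stage, so the COPY INTO command only references the stage name. A sketch, assuming an AWS IAM role has already been set up for Snowflake (all names and the role ARN below are placeholders):

    -- One-time setup (requires the CREATE INTEGRATION privilege, e.g. ACCOUNTADMIN)
    CREATE STORAGE INTEGRATION s3_export_int
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'S3'
      ENABLED = TRUE
      STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-export-role'
      STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/exports/');

    -- Run DESC INTEGRATION s3_export_int to get the IAM user ARN and external ID
    -- needed for the AWS role's trust policy.

    CREATE STAGE my_s3_stage
      URL = 's3://my-bucket/exports/'
      STORAGE_INTEGRATION = s3_export_int;

    -- The SQL component then needs no credentials at all:
    COPY INTO @my_s3_stage/my_table/
    FROM my_database.my_schema.my_table
    FILE_FORMAT = (TYPE = CSV)
    HEADER = TRUE;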