I have an unusual problem: 270 million files in one S3 bucket. I can't read or list the files with Matillion or even from Snowflake; the apps just hang and never return. I need to move about 4 million of the files (very random names, stored in a Snowflake table) into a different S3 bucket. Everything works until the data transfer stops after 5,000 files. Has anyone found a way around this?
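In case it helps anyone hitting the same wall: since the 4 million key names are already in a Snowflake table, one way to sidestep listing the bucket entirely is to export the keys (e.g. via `COPY INTO` a stage or an unload to CSV) and drive server-side S3 copies by key. Below is a minimal sketch with boto3, assuming the keys sit in a local CSV and that the bucket names, file name, and worker count are placeholders to adjust:

```python
import csv
import boto3
from concurrent.futures import ThreadPoolExecutor

# Placeholder names -- adjust to your setup.
SRC_BUCKET = "source-bucket"
DST_BUCKET = "dest-bucket"
KEY_LIST_CSV = "keys_to_move.csv"  # one object key per row, exported from Snowflake

s3 = boto3.client("s3")

def copy_one(key: str) -> None:
    # Server-side copy: nothing is downloaded through the client machine,
    # and no bucket listing happens, so the other ~270M objects never come into play.
    s3.copy_object(
        Bucket=DST_BUCKET,
        Key=key,
        CopySource={"Bucket": SRC_BUCKET, "Key": key},
    )

def main() -> None:
    with open(KEY_LIST_CSV, newline="") as f:
        keys = [row[0] for row in csv.reader(f) if row]
    # Parallel copies; copy_object handles objects up to 5 GB
    # (larger objects would need a multipart copy instead).
    with ThreadPoolExecutor(max_workers=32) as pool:
        list(pool.map(copy_one, keys))

if __name__ == "__main__":
    main()
```

Since S3 has no native "move", you'd follow up with batched `delete_objects` calls (up to 1,000 keys per call) on the source keys once the copies are verified. This approach also avoids whatever per-run cap is stopping the transfer at 5,000 files, since each copy is an independent API call keyed off your own list rather than a listing of the bucket.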