Hi All,

I need to extract 75 TB (uncompressed) data from Oracle Exadata 19c (on-premise) and load into BigQuery. Please could you suggest the best way to unload this data from Exadata?

Hi @ag00352599,

The answer to your question depends on a number of factors. If this is just a one-time load, then you have several options. If you need to load and then incrementally update the tables in BigQuery periodically, that changes a few things and makes it a potentially much harder implementation.

One option would be to use Matillion and load blocks of tables depending on their size, but this will likely take days to finish on a 75 TB database. There are other products out there that will move this volume of data. I can't speak for Google, but AWS has DMS (Database Migration Service), which is geared to do this type of heavy lift. Google Cloud has its own Database Migration Service, and Datastream supports Oracle sources, so those would be worth investigating. If you want to private message me, I can give you a couple of syncing products that we use. These are paid products but could potentially be a huge help.
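
For the one-time bulk path, the usual pattern is to unload each table to compressed CSV (or Parquet), stage the files in Cloud Storage, and run BigQuery load jobs against them. Here's a minimal Python sketch of that pipeline using `python-oracledb` and the Google Cloud client libraries; the connection string, table, bucket, and dataset names are all placeholders, not anything from this thread:

```python
import csv
import gzip

import oracledb
from google.cloud import bigquery, storage

# All names below are placeholders for illustration only.
ORACLE_DSN = "exadata-host:1521/ORCLPDB1"
SRC_TABLE = "SALES.ORDERS"
OUT_FILE = "orders.csv.gz"
STAGING_BUCKET = "my-staging-bucket"
BQ_TABLE = "my-project.my_dataset.orders"

# 1) Stream one table out of Oracle in large fetch batches to gzipped CSV.
with oracledb.connect(user="extract_user", password="...", dsn=ORACLE_DSN) as conn:
    with conn.cursor() as cur:
        cur.arraysize = 10_000  # big fetch batches cut network round trips
        cur.execute(f"SELECT * FROM {SRC_TABLE}")
        with gzip.open(OUT_FILE, "wt", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(col[0] for col in cur.description)  # header row
            while rows := cur.fetchmany():
                writer.writerows(rows)

# 2) Stage the file in Cloud Storage; BigQuery load jobs can read gzipped
#    CSV from gs:// URIs directly.
storage.Client().bucket(STAGING_BUCKET).blob(OUT_FILE).upload_from_filename(OUT_FILE)

# 3) Run a BigQuery load job against the staged file.
bq = bigquery.Client()
job = bq.load_table_from_uri(
    f"gs://{STAGING_BUCKET}/{OUT_FILE}",
    BQ_TABLE,
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,  # in practice you'd supply an explicit schema
    ),
)
job.result()  # block until the load job finishes
print(f"Loaded {bq.get_table(BQ_TABLE).num_rows} rows into {BQ_TABLE}")
```

At 75 TB you'd run many of these extracts concurrently, splitting large tables by partition or ROWID range, and the outbound bandwidth from your data center will usually be the real bottleneck, so it's worth estimating transfer time before committing to a network-based approach.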