Hello community!
Our data team is getting spun up on Matillion ETL (loving it so far!) while also moving from BigQuery to Redshift to align with our parent organization's tech stack. As part of this effort we're trying to use Matillion ETL for Redshift's Google BigQuery component to load historical data from BigQuery over to Redshift.
A blocker in this strategy is that the Google BigQuery component doesn't let us specify the data types of the columns in the target table. String columns from BigQuery always land in the target table as varchar(2000), but many values in these columns exceed 10,000 characters in BigQuery, so the data are truncated.
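To clarify what we're after, this is roughly the shape of DDL we'd want to be able to produce for the target table (table and column names here are just made-up placeholders):

```sql
-- Hypothetical target table: the kind of DDL we'd want the load to use.
-- Redshift VARCHAR can go up to 65535 bytes, so wide string columns are possible.
CREATE TABLE staging.bq_events (
    event_id      VARCHAR(256),
    event_payload VARCHAR(65535),  -- wide enough for our long BigQuery strings
    created_at    TIMESTAMP
);
```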
Our next thought is to bring the data into S3 and reference it via an external table, but before we try that, is there anything else we can try in Matillion to easily move data from BigQuery that exceeds 2,000 characters over to Redshift?
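For reference, this is roughly the Redshift Spectrum setup we had in mind if we go the S3 route (the schema, bucket, and IAM role ARN below are placeholders, not our real values):

```sql
-- Sketch of the external-table approach we're considering.
-- All names and the role ARN are placeholders.
CREATE EXTERNAL SCHEMA IF NOT EXISTS bq_export
FROM DATA CATALOG
DATABASE 'bq_export_db'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftSpectrumRole'
CREATE EXTERNAL DATABASE IF NOT EXISTS;

CREATE EXTERNAL TABLE bq_export.events (
    event_id      VARCHAR(256),
    event_payload VARCHAR(65535),
    created_at    TIMESTAMP
)
STORED AS PARQUET
LOCATION 's3://our-bucket/bq-export/events/';
```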
Thank you in advance for your help,
Bryan