Hi Team,
I have developed an API Query component to ingest data from Oracle CPQ. However, the API response is a fairly large JSON payload, and the ingestion process is running very slowly and even timing out. My understanding is that the API Query Profile is spending significant time parsing and flattening the JSON before inserting it into Snowflake.
As a comparison, I tested the same API call using a Python component in Matillion, where I queried the API, parsed the JSON, and inserted it into Snowflake. This approach was noticeably faster than the API Query Profile.
My question is: Is there a way to configure the API Query component to simply retrieve the API response and ingest the raw JSON directly into a VARIANT column in Snowflake, without flattening?
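For context, the Python approach I tested looks roughly like the sketch below. This is a minimal illustration rather than my exact component code: the table name is a placeholder, and `build_variant_insert` is a hypothetical helper shown here just to make the generated SQL visible. It hands the raw payload to Snowflake's `PARSE_JSON` so the warehouse does the parsing, instead of flattening in the profile.

```python
import json

def build_variant_insert(table: str, payload: dict) -> tuple[str, tuple]:
    """Build a parameterised INSERT that loads one raw JSON payload
    into a single VARIANT column, letting Snowflake parse it server-side."""
    sql = f"INSERT INTO {table} (raw_json) SELECT PARSE_JSON(%s)"
    return sql, (json.dumps(payload),)

# Hypothetical usage with the Snowflake Python connector (not run here;
# the endpoint and connection arguments are placeholders):
#
# import requests, snowflake.connector
# payload = requests.get("https://<cpq-host>/<endpoint>").json()
# sql, params = build_variant_insert("CPQ_RAW", payload)
# with snowflake.connector.connect(**conn_args) as con:
#     con.cursor().execute(sql, params)
```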
I was advised to look into the API Extract component, but it does not support “Advanced Mode,” so I’m unable to apply the additional configuration needed.
Thanks,
Srinadhan
Hello @srinadhan_samudrala
Thank you for your post. I will take this away and look into it for you.
Kind regards, Joe

Hello @srinadhan_samudrala
Thank you for your patience whilst I went away and looked into this.
You’re hitting a common default behaviour: API Query profiles often default to VARCHAR because they can’t always predict the structure of the incoming data. If you want more control, specifically to use a VARIANT type for JSON, you’ll need to handle the table schema manually.
Here is the standard workflow to fix this:
1. Capture the Full Payload
Instead of mapping individual fields in your API profile, create a profile with a single column. You’ll likely need to use a special attribute (often called Aggregate) to ensure you are capturing the entire JSON blob rather than just a specific root value.
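For reference, a single-column profile might look something like the sketch below. This is a rough, unverified example based on the CData-style RSD format that API Query profiles use; the attribute names, the `other:valueFormat="aggregate"` setting, and the URI are all assumptions you should check against the driver documentation for your Matillion version.

```xml
<api:script xmlns:api="http://apiscript.com/ns?v1"
            xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <api:info title="CPQRaw" desc="Returns the raw CPQ JSON payload as one column.">
    <!-- Single column holding the entire JSON blob;
         'aggregate' value format is an assumption to verify -->
    <attr name="RawJSON" xs:type="string" other:xPath="/json"
          other:valueFormat="aggregate" />
  </api:info>
  <!-- Placeholder endpoint -->
  <api:set attr="uri" value="https://your-cpq-host/endpoint" />
  <api:call op="jsonproviderGet">
    <api:push />
  </api:call>
</api:script>
```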
2. Manage the Column Size
By default, these columns often cap out at VARCHAR(2000). If your JSON payload is beefy, you’ll need to override this in your connection options:
- Parameter: defaultcolumnsize
- Value: Set this to a significantly larger number (e.g., 16777216 for Snowflake) to prevent truncation before the data hits your table.
3. Manual Table Creation
To get that VARIANT data type:
- Create the target table manually in your database with the column defined as VARIANT (or the equivalent JSON type for your specific warehouse).
- Disable “Create Target Table” within the component settings. This forces the component to map the incoming data to your existing, correctly-typed table rather than trying to build a new VARCHAR one on the fly.
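Concretely, the manual table for step 3 could be created like this in Snowflake (the table and column names here are placeholders to adapt to your schema):

```sql
-- Placeholder names; adjust to your own schema
CREATE TABLE CPQ_RAW (
    raw_json VARIANT
);

-- Once loaded, nested fields can be queried directly, e.g.
-- (the path below is illustrative, not an actual CPQ field):
-- SELECT raw_json:someField FROM CPQ_RAW;
```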
I hope this helps. Please let me know if you need anything else and do keep me posted.
Kind regards, Joe
Hello, thank you very much for the detailed steps. Below is my API Query profile. When setting up the column I tried the option below and just got a blank result. Do you have a sample RSD file I could refer to?
The issue is resolved now. I was able to use your suggestion along with the other posted information.
That is awesome @srinadhan_samudrala
Thank you for letting us know.
Kind regards, Joe