Two components in the same pipeline are using different service accounts

Hello All,

I am facing a strange error in my pipeline. I am using a Bash Script component along with a Python Script component and other Data Load and Data Transfer components. All of the components use the service account I specified when configuring my environment, except the Bash Script component, which appears to be using a different service account.

Error from the Bash Script component: AccessDeniedException: 403 matillion-dev-sa@blah.iam.gserviceaccount.com does not have storage.objects.list access to the Google Cloud Storage bucket.

Error from the Data Transfer component:

403 Forbidden
...
"error": {
    "code": 403,
    "message": "abc-matillion-sa@blah.iam.gserviceaccount.com does not have storage.objects.delete access to the Google Cloud Storage object.",
...

The service account currently configured in the environment is abc-matillion-sa.

Has anybody faced this issue before? Any idea what's going wrong?

Hi @User1641361755193676745,

I am completely guessing here, but I wonder if Bash Script components default to the service account attached to the Matillion instance itself. If so, you could give that service account the appropriate access to the Storage bucket in question. I typically do everything in Python (which is an option), as it's much more flexible and portable later, but that's just my preference, and it could be why I haven't seen this issue before.
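One quick way to check that guess: on a GCE-hosted instance, the metadata server reports which service account the default credentials resolve to. This is only a minimal diagnostic sketch (using the standard GCE metadata endpoint, nothing Matillion-specific); run it from a Python Script component, or the equivalent curl from the Bash Script component, and compare the output against abc-matillion-sa.

    # Print the service account behind the instance's default credentials.
    # Queries the standard GCE metadata server; the Metadata-Flavor header is required.
    import urllib.request

    req = urllib.request.Request(
        "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/email",
        headers={"Metadata-Flavor": "Google"},
    )
    print(urllib.request.urlopen(req).read().decode())

If that prints matillion-dev-sa rather than abc-matillion-sa, it would support the idea that the Bash Script is falling back to the instance's own identity.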

Another option may be to set the credentials explicitly before the call is made and see if that fixes it. I'm not saying you should have to do this, but it would confirm the issue. If this approach does fix the problem, it may be worth reaching out to support to confirm whether it's supposed to work that way. If they confirm it is, I would post an idea on the Ideas Portal asking for the credentials tied to the environment to be applied before any Bash Scripts run.
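For reference, here is a rough sketch of what "setting the credentials explicitly" could look like in a Python Script component. It assumes the google-cloud-storage library is available on the instance, and both /path/to/key.json (a key file for abc-matillion-sa) and my-bucket are hypothetical placeholders, not values from your setup.

    # Use explicit service account credentials instead of the default identity.
    from google.cloud import storage
    from google.oauth2 import service_account

    credentials = service_account.Credentials.from_service_account_file("/path/to/key.json")
    client = storage.Client(credentials=credentials)

    # If this listing succeeds where the component failed, the problem is the
    # identity being picked up by default, not the bucket permissions themselves.
    for blob in client.list_blobs("my-bucket"):
        print(blob.name)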

I hope this helps!