S3 bucket/files for training not public?

Hello... very quick question...

While following the training steps provided on this page LINK, I'm not able to read the S3 bucket where the files I need are located.

 

The bucket is not accessible, but according to the description of the task it should be:

You will be reading a compressed CSV file located in a public S3 bucket located in our AWS account.

 

This is the s3 bucket: s3://online-mtln-training-s3-flights/

Has anyone else had the same issue? If so, and this is a problem on Matillion's side, how can I ask them to check?

 

Thanks for now

F.

 

PS, just in case: yes, I've noticed that the S3 bucket in the documentation and the one in their screenshot are different (in the screenshot they use s3://mtln-training-s3-flights/ ). That one is not public either.

 

Hi @frenke​,

I was wondering if the root cause is the S3 privileges you have on your Matillion ETL instance.

If you press the Test button in your Environment, to verify the cloud API privileges, what do you see?

Something like this is good...

[screenshot: Environment test showing "S3 API: success"]

Whereas something like this is bad...

[screenshot: Environment test showing "S3 API: check credentials"]

If you are seeing a "check credentials" warning, I would guess you are missing some of the S3 privileges. We have this document on various ways to set up IAM Roles & Permissions on AWS, which should help you add them.
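Outside of Matillion, a quick way to sanity-check what the instance can actually see is to run a couple of calls with the same credentials. This is only a sketch, assuming Python and boto3 are available on the instance, and using the training bucket name mentioned above:

import boto3

# Which identity (IAM role or user) is the instance actually using?
ident = boto3.client("sts").get_caller_identity()
print("Account:", ident["Account"])
print("ARN:", ident["Arn"])

# Can that identity list the training bucket at all?
s3 = boto3.client("s3")
bucket = "online-mtln-training-s3-flights"
try:
    resp = s3.list_objects_v2(Bucket=bucket, MaxKeys=5)
    for obj in resp.get("Contents", []):
        print(obj["Key"])
except Exception as exc:  # e.g. AccessDenied or NoSuchBucket
    print("Cannot list", bucket, "->", exc)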

Ian

Hi Ian, nice to meet you and thanks for the answer.

 

I do indeed have the warning for S3. That is strange, because I have full access (all actions allowed on a dedicated bucket).

I played a little bit with the permissions, and while there's still something I don't understand or don't like, I was able to get the S3 API: success 🙌 and, more specifically, to read the external bucket for the training.

 

So you are right, that was the root cause. 🍾

 

Thanks again for the help!

F

Hello @Bryan, nice to meet you and thanks for your reply.

I agree with what you say: with roles we have so much more control and understanding, thanks!

Just to add some more information about the initial problem (maybe @ian.funnell could also be interested?): I already had full permissions on my bucket.

The problem I found is that the initial policy was something like ALLOW ALL ACTIONS ON BUCKET XYZ AND OBJECTS XYZ/*.

There are, however, some actions/permissions that are generic and don't target specific buckets/objects; one of them is "ListAllMyBuckets", which needs to be allowed on ALL S3 resources.

Our policy was generated by a script, so we thought allowing ALL on everything in that bucket was enough, but it is not: we also need to grant "ListAllMyBuckets" at the level of the overall S3 service.
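For anyone hitting the same thing, here is a rough sketch of what the fixed policy looks like, applied with boto3. This is only an illustration: the role name, policy name and bucket are placeholders, not the actual names we use.

import json
import boto3

ROLE_NAME = "my-matillion-role"   # placeholder role attached to the Matillion instance
BUCKET = "XYZ"                    # placeholder bucket name

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # bucket-scoped statement: this is what we already had
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": [f"arn:aws:s3:::{BUCKET}", f"arn:aws:s3:::{BUCKET}/*"],
        },
        {   # service-wide statement: ListAllMyBuckets cannot be limited to one bucket
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "*",
        },
    ],
}

boto3.client("iam").put_role_policy(
    RoleName=ROLE_NAME,
    PolicyName="matillion-s3-access",
    PolicyDocument=json.dumps(policy),
)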

As stated in the docs, this is only used for exploration and discovery, so it is not strictly needed, but I wanted to see the "S3 API: success", so for the moment I'm OK with that.

Lesson learnt in this case: remember (what I already knew) that some permissions need to be applied to the overall service, not to specific objects/buckets.

Thanks again for the help and support, I'm back in business for the moment.

FL

I have the same issue but with a different message: it says that the S3 bucket does not exist.

Instead, I can see other folders, but not the one with the files from the Matillion Academy training.

As a follow-up @frenke​,

Check that the IAM role Matillion is using has permissions on that particular S3 bucket. The proper way to control permissions in AWS is through IAM roles, not through the services or objects themselves. I have seen people mix IAM role permissions with service/object (in your case, S3 bucket) permissions. Although you can do this, it becomes extremely hard to understand what the role and/or the users in that role actually have permissions to.
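If it helps, one way to see what a role can actually do against a given bucket is IAM's policy simulator. Just a sketch with boto3; the role ARN is a placeholder for whichever role your Matillion instance uses.

import boto3

iam = boto3.client("iam")

role_arn = "arn:aws:iam::123456789012:role/my-matillion-role"   # placeholder
bucket_arn = "arn:aws:s3:::online-mtln-training-s3-flights"

result = iam.simulate_principal_policy(
    PolicySourceArn=role_arn,
    ActionNames=["s3:ListBucket", "s3:GetObject", "s3:ListAllMyBuckets"],
    ResourceArns=[bucket_arn, bucket_arn + "/*"],
)

for r in result["EvaluationResults"]:
    # EvalDecision is "allowed", "implicitDeny" or "explicitDeny"
    print(r["EvalActionName"], "->", r["EvalDecision"])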

I am having a similar issue trying to read the Airport Avro files in AWS S3.

Unable to load credentials from any of the providers in the chain AwsCredentialsProviderChain(credentialsProviders=[SystemPropertyCredentialsProvider(), EnvironmentVariableCredentialsProvider(), WebIdentityTokenCredentialsProvider(), ProfileCredentialsProvider(profileName=default, profileFile=ProfileFile(profilesAndSectionsMap=[])), ContainerCredentialsProvider(), InstanceProfileCredentialsProvider()]) : [SystemPropertyCredentialsProvider(): Unable to load credentials from system settings. Access key must be specified either via environment variable (AWS_ACCESS_KEY_ID) or system property (aws.accessKeyId)., EnvironmentVariableCredentialsProvider(): Unable to load credentials from system settings. Access key must be specified either via environment variable (AWS_ACCESS_KEY_ID) or system property (aws.accessKeyId)., WebIdentityTokenCredentialsProvider(): Either the environment variable AWS_WEB_IDENTITY_TOKEN_FILE or the javaproperty aws.webIdentityTokenFile must be set., ProfileCredentialsProvider(profileName=default, profileFile=ProfileFile(profilesAndSectionsMap=[])): Profile file contained no credentials for profile 'default': ProfileFile(profilesAndSectionsMap=[]), ContainerCredentialsProvider(): Cannot fetch credentials from container - neither AWS_CONTAINER_CREDENTIALS_FULL_URI or AWS_CONTAINER_CREDENTIALS_RELATIVE_URI environment variables are set., InstanceProfileCredentialsProvider(): Failed to load credentials from IMDS.]

 

The AWS connection test for S3 is successful; the test does say 'Check Credentials' for DMS (don't know if this is relevant).
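For reference, the SDK walks that provider chain in order (system properties, environment variables, web identity token, profile file, container, instance profile). boto3 resolves credentials in a very similar way to the Java SDK in that error, so a quick sketch like this (assuming Python and boto3 are available on the instance) shows which provider, if any, actually supplies credentials:

import boto3

session = boto3.Session()
creds = session.get_credentials()

if creds is None:
    print("No credentials found by any provider in the chain")
else:
    frozen = creds.get_frozen_credentials()
    print("Provider used:", creds.method)                # e.g. 'iam-role', 'env', 'shared-credentials-file'
    print("Access key:", frozen.access_key[:4] + "...")  # avoid printing the full key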

 

Any help would be appreciated.

I am currently having this same issue - getting Status Code: 404; Error Code: NoSuchBucket;

Has anyone responded with a fix?

Resolved this issue. The environment setting for AWS credentials defaults to 'Instance'. I had set up an AWS IAM user and group; you can add and test this user in the AWS user wizard, but that does not switch over the actual setting. Once I noticed the setting was still on 'Instance', I changed it to the AWS IAM user and the S3 connection to the sample data worked.

I resolved it myself: I downloaded the .gz files from the Google repository, copied them into an S3 bucket of my own, then just referenced the location where I placed the files, and that was it.
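In case it helps anyone going the same route, a rough sketch of the copy step with boto3; the bucket name, prefix and file names are placeholders for your own:

import boto3

s3 = boto3.client("s3")
my_bucket = "my-own-training-bucket"          # placeholder - a bucket you control

for filename in ["flights.csv.gz"]:           # placeholder file name(s) downloaded locally
    key = f"flights/{filename}"
    s3.upload_file(filename, my_bucket, key)  # then point the training job at s3://my-own-training-bucket/flights/
    print("Uploaded", filename, "to", f"s3://{my_bucket}/{key}")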

Hi @david.foster​! Thanks so much for returning to share how you solved the issue.

I hope to see you around the forums again!

Claire