Using AWS credentials profiles with a Spark Scala app

Question:

I would like to be able to use the ~/.aws/credentials file I maintain, with its different profiles, from my Spark Scala application, if that is possible. I know how to set the Hadoop configuration for s3a inside my app, but I don't want to keep hardcoding different keys; I'd rather just use my credentials file as I do with other programs. I've also experimented with the Java API, e.g. val credentials = new DefaultAWSCredentialsProviderChain().getCredentials(), and then creating an S3 client, but that doesn't let me use those keys when reading files from S3 through Spark. I also know the keys can go in core-site.xml when I run my app, but how would I manage different keys there, and how can I set things up in IntelliJ so that different profiles pull in different keys?

Answer:

The DefaultAWSCredentialsProviderChain does not let you choose a profile from code; to read a specific named profile from ~/.aws/credentials you need to build the provider chain yourself.
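A minimal sketch, assuming the AWS SDK for Java v1 is on the classpath; the profile name "my-profile" is just a placeholder:

```scala
import com.amazonaws.auth.{AWSCredentialsProviderChain, EnvironmentVariableCredentialsProvider}
import com.amazonaws.auth.profile.ProfileCredentialsProvider

// Reads the named profile from ~/.aws/credentials, falling back to environment variables.
val credentials = new AWSCredentialsProviderChain(
  new EnvironmentVariableCredentialsProvider(),
  new ProfileCredentialsProvider("my-profile")
)
```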

You can then use those credentials with an S3 client or, as you mention, with Spark.
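For example (a sketch only: credentials is the provider chain built above, spark is assumed to be an existing SparkSession, and the region is a placeholder):

```scala
import com.amazonaws.regions.Regions
import com.amazonaws.services.s3.AmazonS3ClientBuilder

// Plain S3 client built from the provider chain above.
val s3 = AmazonS3ClientBuilder.standard()
  .withCredentials(credentials)
  .withRegion(Regions.US_EAST_1)
  .build()

// For Spark/S3A, either hand the resolved keys to the Hadoop configuration...
val resolved = credentials.getCredentials
val hadoopConf = spark.sparkContext.hadoopConfiguration
hadoopConf.set("fs.s3a.access.key", resolved.getAWSAccessKeyId)
hadoopConf.set("fs.s3a.secret.key", resolved.getAWSSecretKey)

// ...or point S3A at a provider class that reads ~/.aws/credentials itself.
hadoopConf.set("fs.s3a.aws.credentials.provider",
  "com.amazonaws.auth.profile.ProfileCredentialsProvider")
```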

To switch between AWS profiles, set the AWS_PROFILE environment variable. Happy to expand on any particular point if needed.
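For instance, if the provider is constructed without an explicit profile name it honours AWS_PROFILE, which in IntelliJ you can set per run configuration in the "Environment variables" field of the Run/Debug Configuration (a sketch; the profile name is an example):

```scala
import com.amazonaws.auth.profile.ProfileCredentialsProvider

// With no explicit profile name, the provider resolves the profile from the
// AWS_PROFILE environment variable, falling back to "default".
// In IntelliJ: Run/Debug Configurations -> Environment variables -> AWS_PROFILE=my-profile
val provider = new ProfileCredentialsProvider()
```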
