Question:
I’m using the Hadoop library to upload files to S3. Because a metrics configuration file is missing, I’m getting this exception:
MetricsConfig - Could not locate file hadoop-metrics2-s3a-file-system.properties
org.apache.commons.configuration2.ex.ConfigurationException: Could not locate: org.apache.commons.configuration2.io.FileLocator@77f46cee[fileName=hadoop-metrics2-s3a-file-system.properties,basePath=
My current configuration is:
configuration.set("fs.s3a.access.key", "accessKey")
configuration.set("fs.s3a.secret.key", "secretKey")
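For context, here is roughly how the configuration gets used to write to the bucket (the bucket name and file paths are placeholders):

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import java.net.URI

val configuration = new Configuration()
configuration.set("fs.s3a.access.key", "accessKey")
configuration.set("fs.s3a.secret.key", "secretKey")

// Bind an S3A FileSystem to the bucket and copy a local file up
val fs = FileSystem.get(new URI("s3a://my-bucket"), configuration)
fs.copyFromLocalFile(new Path("/tmp/local-file.txt"), new Path("s3a://my-bucket/remote-file.txt"))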
Where should I add this configuration file, and what should it contain?
Answer:
Don’t worry about it; it’s just an irritating warning. It’s only relevant when you have the s3a or abfs connectors running in a long-lived app where the metrics are being collected and fed to some management tooling.
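If you do want the metrics collected, the file is picked up from the application classpath and follows the standard Hadoop metrics2 properties format. A minimal sketch (the sink name and output filename here are assumptions, not required values):

# hadoop-metrics2-s3a-file-system.properties -- place on the application classpath
# Emit all s3a-file-system metrics to a local file every 60 seconds
*.period=60
s3a-file-system.sink.file.class=org.apache.hadoop.metrics2.sink.FileSink
s3a-file-system.sink.file.filename=s3a-metrics.out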
To silence the message instead, set the log level for that logger to WARN in the log4j.properties file in your Spark conf dir:
log4j.logger.org.apache.hadoop.metrics2=WARN
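If your build uses Log4j 2 instead (Spark 3.3 and later read log4j2.properties), the equivalent would be along these lines:

# Quiet the Hadoop metrics2 subsystem under Log4j 2
logger.hadoopmetrics.name = org.apache.hadoop.metrics2
logger.hadoopmetrics.level = warn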