You can use the script below to read the contents of a file from an S3 bucket using boto3 and print the output to the command line.
Step 1: Install and configure boto3
https://cloudaffaire.com/how-to-install-python-boto3-sdk-for-aws/
https://cloudaffaire.com/how-to-configure-python-boto3-sdk-for-aws/
https://pypi.org/project/argparse/ (argparse is part of the Python 3 standard library, so installing this package is only needed on very old interpreters)
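Once boto3 is installed and configured, you can optionally confirm that your credentials are being picked up before moving on. The snippet below is a minimal sketch, not part of the original steps, and assumes the default AWS credential chain (environment variables, ~/.aws/credentials, or an instance role) is in place.

## Optional check: confirm boto3 can authenticate with your AWS account
import boto3
from botocore.exceptions import ClientError, NoCredentialsError

try:
    # STS returns the account and IAM identity behind the configured credentials
    identity = boto3.client('sts').get_caller_identity()
    print('Authenticated as:', identity['Arn'])
except (ClientError, NoCredentialsError) as error:
    print('boto3 is not configured correctly:', error)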
Step 2: Create a script to print the content of an S3 object to the command line
## Create a python script to print the content of an S3 object
cat << EOF > print_object.py
import argparse
import boto3
from botocore.exceptions import ClientError

# Parse the bucket name and object key from the command line
parser = argparse.ArgumentParser(description='Print content of an s3 object')
parser.add_argument('--bucket_name', type=str, help='The name of the bucket')
parser.add_argument('--key', type=str, help='file or folder path in S3')
args = parser.parse_args()

client = boto3.client('s3')

def print_object(bucket_name, key):
    try:
        # Fetch the object and print its body as UTF-8 text
        data = client.get_object(Bucket=bucket_name, Key=key)
        contents = data['Body'].read()
        print(contents.decode("utf-8"))
    except ClientError as e:
        # Print the AWS error (missing bucket, missing key, access denied, ...)
        print(e)

bucket_name = args.bucket_name
key = args.key

print_object(bucket_name, key)
EOF
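The script above reads the whole object into memory before printing it, which is fine for small text files. For large objects you may prefer to stream the body in chunks instead. The sketch below is an optional alternative, not part of the original article, that writes raw bytes straight to stdout using botocore's StreamingBody.iter_chunks().

## Optional variant: stream a large object to stdout without loading it all into memory
import sys
import boto3
from botocore.exceptions import ClientError

client = boto3.client('s3')

def stream_object(bucket_name, key, chunk_size=1024 * 1024):
    try:
        data = client.get_object(Bucket=bucket_name, Key=key)
        # iter_chunks() yields raw bytes; writing to stdout.buffer avoids
        # decode errors when a multi-byte character spans two chunks
        for chunk in data['Body'].iter_chunks(chunk_size):
            sys.stdout.buffer.write(chunk)
    except ClientError as e:
        print(e, file=sys.stderr)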
Step 3: Execute the script to print the contents of a file hosted in an S3 bucket
## Execute the script
python3 print_object.py --bucket_name cloudaffaire1 --key targetDir/targetFile
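If the target object does not exist yet, the script will simply print the resulting ClientError (for example, NoSuchKey). As a quick test you can upload a small object first; the snippet below is a sketch that reuses the article's example bucket and key, so replace them with your own values.

## Optional: upload a sample object so the script has something to print
import boto3

client = boto3.client('s3')
client.put_object(
    Bucket='cloudaffaire1',        # example bucket name from this article
    Key='targetDir/targetFile',    # example key from this article
    Body=b'Hello from S3!\n'
)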