Question:
AWS Firehose was released today. I'm playing around with it and trying to figure out how to put data into the stream using the AWS CLI. I have a simple JSON payload and a corresponding Redshift table with columns that map to the JSON attributes. I've tried various combinations, but I can't seem to pass the JSON payload in via the CLI.
What I’ve tried:
aws firehose put-record --delivery-stream-name test-delivery-stream --record '{ "attribute": 1 }'

aws firehose put-record --delivery-stream-name test-delivery-stream --record { "attribute": 1 }

aws firehose put-record --delivery-stream-name test-delivery-stream --record Data='{ "attribute": 1 }'

aws firehose put-record --delivery-stream-name test-delivery-stream --record Data={ "attribute": 1 }

aws firehose put-record --delivery-stream-name test-delivery-stream --cli-input-json '{ "attribute": 1 }'

aws firehose put-record --delivery-stream-name test-delivery-stream --cli-input-json { "attribute": 1 }
I've looked at the CLI help, which hasn't helped. This article was published today, but it looks like the command it uses is already outdated, as the argument "--firehose-name" has been replaced by "--delivery-stream-name".
Answer:
The --record parameter takes a Data blob, so wrap the JSON payload in a Data field and escape the double-quotes around the keys and values inside the blob:
aws firehose put-record --delivery-stream-name test-delivery-stream --record '{"Data":"{\"attribute\":1}"}'
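The --cli-input-json route works too, but it expects the whole PutRecord request shape (not just the payload), with the payload escaped inside the Data string. A minimal sketch, assuming a file named put-record.json (hypothetical name):

put-record.json:

{
  "DeliveryStreamName": "test-delivery-stream",
  "Record": {
    "Data": "{\"attribute\": 1}"
  }
}

aws firehose put-record --cli-input-json file://put-record.json

You can generate that skeleton with aws firehose put-record --generate-cli-skeleton instead of writing it by hand. (On newer AWS CLI versions that treat blob parameters as base64 by default, the Data value may need to be base64-encoded, or pass --cli-binary-format raw-in-base64-out.)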