When using Boto3's batch write, what is the maximum number of records I can insert into a DynamoDB table? Suppose I'm reading my input JSON from an S3 bucket, and the file is 6 GB in size.
Will it cause any performance issues when inserting as a batch? Any sample would be helpful. I've just started looking into this; I'll update here based on my findings.
Thanks in advance.
You can find information like this in the service documentation for BatchWriteItem:
A single call to BatchWriteItem can write up to 16 MB of data, which can comprise as many as 25 put or delete requests. Individual items to be written can be as large as 400 KB.
There are no performance issues beyond consuming write capacity units. Note that a 6 GB file is far larger than the 16 MB per-call limit, so you will be making many BatchWriteItem calls; make sure the table has enough provisioned write capacity (or uses on-demand mode), and retry any UnprocessedItems the API returns.
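As a minimal sketch, boto3's `Table.batch_writer()` context manager handles the 25-item grouping and resends unprocessed items for you, so you rarely need to call `BatchWriteItem` directly. The bucket, key, and table names below are placeholders, and for a real 6 GB file you would want to parse the JSON incrementally rather than load the whole array into memory; the `chunk` helper just illustrates the 25-item limit if you were batching manually:

```python
import json

def chunk(items, size=25):
    """Split items into lists of at most `size` (25 is the BatchWriteItem
    per-call request limit)."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def load_json_to_dynamodb(bucket, key, table_name):
    """Read a JSON array of items from S3 and batch-write them to DynamoDB."""
    import boto3  # imported here so the chunk() helper works without boto3 installed
    s3 = boto3.resource("s3")
    table = boto3.resource("dynamodb").Table(table_name)
    items = json.loads(s3.Object(bucket, key).get()["Body"].read())
    # batch_writer buffers puts into 25-item BatchWriteItem calls and
    # automatically retries any UnprocessedItems.
    with table.batch_writer() as writer:
        for item in items:
            writer.put_item(Item=item)
```

Note that `batch_writer` only batches the requests; each underlying call still consumes write capacity per item, so throughput is ultimately governed by the table's capacity settings, not by the batching itself.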