Aug-26-2019, 11:31 AM
Hi all,
I was doing some web scraping with Goose and Beautiful Soup: extract the article content, then upload the URLs to DynamoDB and the content to Elasticsearch.
Issue:
Running the same code in AWS Lambda requires layers (Python modules for Goose, BS4, boto3), which push the layer size above the 250 MB unzipped limit.
How can I add these modules as a layer, or is there another way to package all the required Python modules?
Thanks in advance.