Python Forum

Full Version: lambda layer size issue
Hi all,
I was doing web scraping with Goose and Beautiful Soup:
extract the content, then upload the URLs to DynamoDB and the content to Elasticsearch.

Issue:
Running the same code in AWS Lambda requires layers (Python modules for goose3, bs4, boto3), which push the unzipped layer size above the 250 MB limit.
How can I add these modules as a layer, or is there another way to include all the required Python modules?


Thanks in advance.
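One thing worth checking before anything else: boto3 (and botocore) already ship with the Lambda Python runtime, so they can usually be left out of the layer entirely, which often saves a large chunk of the 250 MB unzipped budget. If the remaining packages still do not fit, Lambda container images (limit 10 GB) are the usual alternative. Here is a small stdlib-only sketch for measuring whether an unpacked layer directory fits under the cap (the directory path is whatever you `pip install --target` into; `layer_dir` below is hypothetical):

```python
import os

# Lambda's documented hard limit: function code plus all layers
# must unzip to at most 250 MB (container images allow up to 10 GB).
LAMBDA_UNZIPPED_LIMIT_MB = 250

def dir_size_mb(path):
    """Total size of every file under `path`, in megabytes."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total / (1024 * 1024)

def fits_in_lambda(layer_dir):
    """True if the unpacked layer directory is under the 250 MB cap."""
    return dir_size_mb(layer_dir) <= LAMBDA_UNZIPPED_LIMIT_MB
```

Run it against the directory you would zip into the layer; if `fits_in_lambda` is False even after dropping boto3, a container image is probably the simpler route.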
You should try standard scraping packages.
I suggest that you take these tutorials:
https://python-forum.io/Thread-Web-Scraping-part-1
and
https://python-forum.io/Thread-Web-Scraping-part-2
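For reference, the basic Beautiful Soup pattern those tutorials build on looks roughly like this (parsing an inline HTML string here so it runs standalone; the tag names and `content` class are made up for the example):

```python
from bs4 import BeautifulSoup

# Stand-in for a fetched page; in practice this would come from
# requests.get(url).text or similar.
html = """<html><body>
<h1>Example page</h1>
<p class="content">First paragraph.</p>
<p class="content">Second paragraph.</p>
</body></html>"""

soup = BeautifulSoup(html, "html.parser")

# Pull the page title and all paragraphs with the (hypothetical) class.
title = soup.h1.get_text()
paragraphs = [p.get_text() for p in soup.find_all("p", class_="content")]
```

Note that `html.parser` is the stdlib backend, so bs4 alone (without lxml) keeps the dependency footprint small, which also helps with the layer-size problem above.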