Python Forum
lambda layer size issue
#1
Hi all,
I am web scraping with Goose and Beautiful Soup: I extract the page content, upload the URLs to DynamoDB, and push the content to Elasticsearch.

Issue:
Running the same code in AWS Lambda requires layers for the Python modules (Goose, BS4, boto3), which pushes the total layer size above the 250 MB limit.
How can I package these modules as a layer, or is there another way to bundle all the required Python modules?
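For context, Lambda's 250 MB limit applies to the *unzipped* size of the function plus all of its attached layers, so it helps to measure the unpacked layer directory before publishing. A minimal sketch (the helper name `dir_size_mb` and the `python/` layer path are just illustrative):

```python
import os

def dir_size_mb(path):
    """Total size of all files under `path`, in megabytes."""
    total = 0
    for root, _, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total / (1024 * 1024)

# Lambda enforces ~250 MB on the unzipped function + layers combined,
# so check the layer's package directory before zipping it, e.g.:
# print(dir_size_mb("python/"))
```

If the result is close to the limit, trimming unused dependencies (or dropping boto3 from the layer, since Lambda's runtime already bundles it) is usually the first thing to try.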


Thanks in advance.
#2
You should try the standard scraping packages.
I suggest that you work through these tutorials:
https://python-forum.io/Thread-Web-Scraping-part-1
and
https://python-forum.io/Thread-Web-Scraping-part-2
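To add to that: since the original problem is layer size, it is worth noting that basic extraction can be done with the standard library alone, which keeps the Lambda package small. A minimal sketch using `html.parser` (the `TextExtractor` class and the sample HTML are just illustrative; for real pages the tutorials' requests + BeautifulSoup approach is more robust):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the page title and paragraph text from raw HTML."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.paragraphs = []
        self._tag = None  # tag we are currently collecting text for

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "p"):
            self._tag = tag

    def handle_endtag(self, tag):
        if tag == self._tag:
            self._tag = None

    def handle_data(self, data):
        if self._tag == "title":
            self.title += data
        elif self._tag == "p":
            text = data.strip()
            if text:
                self.paragraphs.append(text)

html = "<html><head><title>Demo</title></head><body><p>Hello</p><p>World</p></body></html>"
parser = TextExtractor()
parser.feed(html)
print(parser.title)       # -> Demo
print(parser.paragraphs)  # -> ['Hello', 'World']
```

No third-party modules means nothing extra to ship in the layer, which sidesteps the 250 MB problem for simple extraction jobs.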