Hello,
I have code that tracks the price of a product on Amazon.
I made an SQLite table that stores: Product, URL, and Alert_Price.
I would like the code to iterate through the table, grabbing all the product URLs and Alert_Prices, so they can get plugged into the code and I can get notified if any of the prices drop to the Alert_Price.
For some reason, I am drawing a blank on how to approach this. Any ideas on how I should go about it?
Thanks in advance.
This is what I have now, but it only works with one product, not multiple:
```python
import requests
from bs4 import BeautifulSoup
import sqlite3
from BX_Constants import MainDatabase

#------------------------------------------
# Get Products to Track
#------------------------------------------
# Connect to the database
connection = sqlite3.connect(MainDatabase)
cursor = connection.cursor()

# Get all the info (Product (name), URL (the product's URL), Alert_Price (budget price))
cursor.execute("SELECT Product, URL, Alert_Price FROM AmazonPriceTracker")
Result = cursor.fetchall()  # Return all the rows
connection.close()  # Close the connection

print(Result, '\n')

#------------------------------------------
# Set your budget
my_price = 300

# Currency symbols to strip out of the price string
currency_symbols = ['€', '£', '$', '¥', 'HK$', '₹', ',']

headers = {
    'authority': 'www.amazon.com',
    'pragma': 'no-cache',
    'cache-control': 'no-cache',
    'dnt': '1',
    'upgrade-insecure-requests': '1',
    'user-agent': 'Mozilla/5.0 (X11; CrOS x86_64 8172.45.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.64 Safari/537.36',
    'accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9',
    'sec-fetch-site': 'none',
    'sec-fetch-mode': 'navigate',
    'sec-fetch-dest': 'document',
    'accept-language': 'en-GB,en-US;q=0.9,en;q=0.8',
}

# Checking the price
def checking_price():
    response = requests.get(URL, headers=headers)  # URL is currently a single hard-coded product
    soup = BeautifulSoup(response.content, "html.parser")

    # Finding the elements
    product_title = soup.find('span', class_="a-size-large product-title-word-break").getText()
    product_price = soup.find('span', class_="a-offscreen").getText()

    # Using replace() to remove currency symbols
    for i in currency_symbols:
        product_price = product_price.replace(i, '')

    print(product_title.strip())
    print(product_price.strip())

    # Converting the string to an integer
    product_price = int(float(product_price))

    # Checking the price
    if product_price < my_price:
        print("You Can Buy This Now!")
    else:
        print("The Price Is Too High!")

checking_price()

# while True:
#     checking_price()
#     time.sleep(3600)  # Run every hour
```
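One way to approach the iteration: make checking_price take the URL and alert price as parameters, then loop over the rows that fetchall() returns, since each row is a (Product, URL, Alert_Price) tuple that can be unpacked directly. A minimal sketch of just that loop — the in-memory database, the sample rows, and the fixed current_price are placeholders standing in for MainDatabase and the real scraping step:

```python
import sqlite3

# In-memory stand-in for MainDatabase, using the same table layout as the post
connection = sqlite3.connect(":memory:")
cursor = connection.cursor()
cursor.execute(
    "CREATE TABLE AmazonPriceTracker (Product TEXT, URL TEXT, Alert_Price REAL)"
)
cursor.executemany(
    "INSERT INTO AmazonPriceTracker VALUES (?, ?, ?)",
    [
        ("Widget", "https://example.com/widget", 300),
        ("Gadget", "https://example.com/gadget", 100),
    ],
)

def checking_price(url, alert_price):
    # Placeholder for the real request + BeautifulSoup parsing:
    # pretend every page parsed to this price
    current_price = 120
    if current_price < alert_price:
        message = "You Can Buy This Now!"
    else:
        message = "The Price Is Too High!"
    print(url, message)
    return message

# fetchall() returns a list of (Product, URL, Alert_Price) tuples,
# so each row unpacks directly into the function's parameters
cursor.execute("SELECT Product, URL, Alert_Price FROM AmazonPriceTracker")
results = [checking_price(url, alert_price)
           for product, url, alert_price in cursor.fetchall()]
connection.close()
```

With this shape, the hard-coded URL and my_price disappear from the function body, and adding a product to track is just another row in the table.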