Python Forum

Full Version: [Solved] Help to iterate through code
Hello,

I have a code that tracks the price of a product on Amazon.

I made an SQLite table that stores the: Product, URL, Alert_Price.

Furthermore, I would like the code to iterate through the table, grab each product's URL and Alert_Price, and plug them into the code so I can get notified if any price drops to its Alert_Price.

For some reason, I am drawing a blank on how to approach this.
So, any ideas on how I should go about it?

Thanks in advance.

This is what I have now but it only works with one product, not multiple:
import requests
from bs4 import BeautifulSoup
import sqlite3
from BX_Constants import (MainDatabase)

#------------------------------------------
#            Get Products to Track
#------------------------------------------
#Connect to the database
connection = sqlite3.connect(MainDatabase)
cursor = connection.cursor()

#Get all the info (Product (name), URL (Product's URL), Price_Alert (Budget Price))
cursor.execute("SELECT Product, URL, Alert_Price FROM AmazonPriceTracker")
connection.commit()
Result = cursor.fetchall() #Return all the URLS
connection.close() #Close the connection

print(Result , '\n')
#------------------------------------------

# Set your budget
my_price = 300

# currency symbols to strip from the price string
currency_symbols = ['€', '£', '$', '¥', 'HK$', '₹', ',']

# the URL we are going to use
URL = 'https://www.amazon.ca/MSI-Geforce-192-bit-Support-Graphics/dp/B07ZHDZ1K6/ref=sr_1_16?crid=1M9LHOYX99CQW&keywords=Nvidia%2BGTX%2B1060&qid=1670109381&sprefix=nvidia%2Bgtx%2B1060%2Caps%2C79&sr=8-16&th=1'

headers = {
'authority': 'www.amazon.com',
'pragma': 'no-cache',
'cache-control': 'no-cache',
'dnt': '1',
'upgrade-insecure-requests': '1',
'user-agent': 'Mozilla/5.0 (X11; CrOS x86_64 8172.45.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.64 Safari/537.36',
'accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9',
'sec-fetch-site': 'none',
'sec-fetch-mode': 'navigate',
'sec-fetch-dest': 'document',
'accept-language': 'en-GB,en-US;q=0.9,en;q=0.8',
}

#Checking the price
def checking_price():
    # page = requests.get(URL, headers=headers)
    response = requests.get(URL, headers=headers)
    soup = BeautifulSoup(response.content, "html.parser")

    #Finding the elements
    product_title = soup.find('span', class_ = "a-size-large product-title-word-break").getText()
    product_price = soup.find('span', class_ = "a-offscreen").getText()

    # using replace() to remove currency symbols
    for i in currency_symbols : 
        product_price = product_price.replace(i,'')

    ProductTitleStrip = product_title.strip()
    ProductPriceStrip = product_price.strip()
    print(ProductTitleStrip)
    print(ProductPriceStrip)

    #Converting the string to integer
    product_price = int(float(product_price))

    # checking the price
    if product_price < my_price:
        print("You Can Buy This Now!")
    else:
        print("The Price Is Too High!")


checking_price()

# while True:
#     checking_price()
#     time.sleep(3600) #Run every hour 
Hard code your solution to get prices for 2 products. Look for repeated code. That is the code you will execute inside a loop, once for each product.
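Concretely, that structure can be sketched like this (a hypothetical check_price helper with a stubbed-in price standing in for the real scraping code, and hard-coded rows standing in for the database results):

```python
def check_price(url, alert_price):
    # Placeholder for the real scraping logic; here we just
    # pretend every product currently costs 100.
    current_price = 100
    return current_price < alert_price

# Hard-coded rows shaped like the tuples fetchall() returns
products = [
    ("GTX 1660", "https://example.com/gtx1660", 300),
    ("2TB SSD", "https://example.com/ssd", 30),
]

# The repeated code runs once per product, inside the loop
for name, url, alert_price in products:
    if check_price(url, alert_price):
        print(f"{name}: You Can Buy This Now!")
    else:
        print(f"{name}: The Price Is Too High!")
```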
I would just like to know how to grab each result from the table & plug those results into my code.

This is what's outputted so far:
Output:
[('GTX 1660', 'https://www.amazon.ca/MSI-Geforce-192-bit-Support-Graphics/dp/B07ZHDZ1K6/ref=sr_1_16?crid=1M9LHOYX99CQW&keywords=Nvidia%2BGTX%2B1060&qid=1670109381&sprefix=nvidia%2Bgtx%2B1060%2Caps%2C79&sr=8-16&th=1', '300'), ('27-1 Terminal ScrewDriver', 'https://www.amazon.ca/Multi-Bit-Precision-Screwdriver-Tamperproof/dp/B09SVP9NDQ/ref=sr_1_2?crid=HXK64C1DIYEK&keywords=klein+27+in+1&qid=1670119650&sprefix=klein+27+in+1%2Caps%2C175&sr=8-2', '30'), ('2TB SSD', 'https://www.amazon.ca/Blue-NAND-2TB-SSD-WDS200T2B0A/dp/B073SBRHH6/ref=sr_1_16?crid=1XWXDCH9XNCTE&keywords=2tb+ssd&qid=1670784581&sprefix=2tb+ssd%2Caps%2C168&sr=8-16', '200')]

MSI Gaming Geforce GTX 1660 Super 192-bit HDMI/DP 6GB GDRR6 HDCP Support DirectX 12 Dual Fan VR Ready OC Graphics Card
339.99
The Price Is Too High!
I want to take the URL & Alert_Price from each row of my table and plug them into the code, but I don't know how to do it, since everything comes back together in one list.
How much do you know about iterators? Do you know what this will print?
for x in [1, 2, 3]:
    print(x)
How much do you know about indexing? Do you know what this will print?
numbers = [1, 2, 3]
print(numbers[1])
Combining iterating and indexing. Do you know what this will print?
items = [[1, 2, 3], [4, 5, 6]]
for item in items:
    print(item[1])
If you can answer all three questions, you should be able to see how to iterate through your database query and get the product, URL and price. It is exactly the same thing.
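Putting the three pieces together: each row from fetchall() is a tuple, and the tuple can be unpacked directly in the for statement (sample rows shown here; yours come from the query):

```python
# Sample rows shaped like the tuples fetchall() returns
rows = [
    ("GTX 1660", "https://example.com/a", "300"),
    ("2TB SSD", "https://example.com/b", "200"),
]

# Unpack each (product, url, alert_price) tuple right in the for statement
for product, url, alert_price in rows:
    # Alert_Price is stored as text, so convert it before comparing
    print(product, url, int(alert_price))
```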
Thanks, that helped me understand it better.

The only problem I have now is that my code only outputs the last item in my table/list, not the other two items.

How would I go about fixing this so it displays all the items, not just the last one in the list?


Thanks again.

Output:
[('GTX 1660', 'https://www.amazon.ca/MSI-Geforce-192-bit-Support-Graphics/dp/B07ZHDZ1K6/ref=sr_1_16?crid=1M9LHOYX99CQW&keywords=Nvidia%2BGTX%2B1060&qid=1670109381&sprefix=nvidia%2Bgtx%2B1060%2Caps%2C79&sr=8-16&th=1', '300'), ('27-1 Terminal ScrewDriver', 'https://www.amazon.ca/Multi-Bit-Precision-Screwdriver-Tamperproof/dp/B09SVP9NDQ/ref=sr_1_2?crid=HXK64C1DIYEK&keywords=klein+27+in+1&qid=1670119650&sprefix=klein+27+in+1%2Caps%2C175&sr=8-2', '30'), ('2TB SSD', 'https://www.amazon.ca/Blue-NAND-2TB-SSD-WDS200T2B0A/dp/B073SBRHH6/ref=sr_1_16?crid=1XWXDCH9XNCTE&keywords=2tb+ssd&qid=1670784581&sprefix=2tb+ssd%2Caps%2C168&sr=8-16', '200')]

Western Digital 2TB WD Blue 3D NAND Internal PC SSD - SATA III 6 Gb/s, 2.5"/7mm, Up to 560 MB/s - WDS200T2B0A
229.99
The Price Is Too High!
Full Code:
import requests
from bs4 import BeautifulSoup
import sqlite3
from BX_Constants import (MainDatabase)

#------------------------------------------
#            Get Products to Track
#------------------------------------------
#Connect to the database
connection = sqlite3.connect(MainDatabase)
cursor = connection.cursor()

#Get all the Reminders
cursor.execute("SELECT Product, URL, Alert_Price FROM AmazonPriceTracker")
connection.commit()
Result = cursor.fetchall() #Return all the Results
connection.close() #Close the connection

print(Result , '\n')
#------------------------------------------
#--------------------------
# #Prints Product
# for result in Result:
#     print(result[0])

# #Prints URL
# for result in Result:
#     print(result[1])

# #Prints Alert_Price
# for result in Result:
#     print(result[2])
#--------------------------

for result in Result:
    product_name = result[0]
    URL = result[1]
    my_price = int(result[2])
    
# Set your budget
# my_price = 300

# currency symbols to strip from the price string
currency_symbols = ['€', '£', '$', '¥', 'HK$', '₹', ',']

# the URL we are going to use
#URL = 'https://www.amazon.ca/MSI-Geforce-192-bit-Support-Graphics/dp/B07ZHDZ1K6/ref=sr_1_16?crid=1M9LHOYX99CQW&keywords=Nvidia%2BGTX%2B1060&qid=1670109381&sprefix=nvidia%2Bgtx%2B1060%2Caps%2C79&sr=8-16&th=1'

headers = {
'authority': 'www.amazon.com',
'pragma': 'no-cache',
'cache-control': 'no-cache',
'dnt': '1',
'upgrade-insecure-requests': '1',
'user-agent': 'Mozilla/5.0 (X11; CrOS x86_64 8172.45.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.64 Safari/537.36',
'accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9',
'sec-fetch-site': 'none',
'sec-fetch-mode': 'navigate',
'sec-fetch-dest': 'document',
'accept-language': 'en-GB,en-US;q=0.9,en;q=0.8',
}

#Checking the price
def checking_price():
    # page = requests.get(URL, headers=headers)
    response = requests.get(URL, headers=headers)
    soup = BeautifulSoup(response.content, "html.parser")

    #Finding the elements
    product_title = soup.find('span', class_ = "a-size-large product-title-word-break").getText()
    product_price = soup.find('span', class_ = "a-offscreen").getText()

    # using replace() to remove currency symbols
    for i in currency_symbols : 
        product_price = product_price.replace(i,'')

    ProductTitleStrip = product_title.strip()
    ProductPriceStrip = product_price.strip()
    print(ProductTitleStrip)
    print(ProductPriceStrip)

    #Converting the string to integer
    product_price = int(float(product_price))

    # checking the price
    if product_price < my_price:
        print("You Can Buy This Now!")
    else:
        print("The Price Is Too High!")


checking_price()

# while True:
#     checking_price()
#     time.sleep(3600) #Run every hour 
You need to call checking_price for each product. I would refactor the code, changing checking_price() to get_price(), a function that returns the current price for a product.
con = sqlite3.connect(MainDatabase)
for product, url, buy_price in con.execute("SELECT Product, URL, Alert_Price FROM AmazonPriceTracker"):
    current_price = get_price(url)
    if current_price < float(buy_price):
        print(f"Buy {product} ${current_price}")
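A get_price() along those lines might look like this (a sketch: the span class is the one the original checking_price() scraped, and a regex-based parse_price helper replaces the symbol-by-symbol replace() loop):

```python
import re

import requests
from bs4 import BeautifulSoup

HEADERS = {"user-agent": "Mozilla/5.0"}

def parse_price(price_text):
    """Drop currency symbols and thousands separators, keeping digits and the dot."""
    return float(re.sub(r"[^\d.]", "", price_text))

def get_price(url):
    """Fetch the product page and return its current price as a float."""
    response = requests.get(url, headers=HEADERS)
    soup = BeautifulSoup(response.content, "html.parser")
    # Same element the original checking_price() scraped
    price_text = soup.find("span", class_="a-offscreen").get_text()
    return parse_price(price_text)
```

Returning a float instead of printing keeps the scraping separate from the buy/no-buy decision, so the loop above can do the comparison.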
Thanks!

That solved my problem.
Don't let apparent complexity intimidate you. This long list of tuples:
Output:
[('GTX 1660', 'https://www.amazon.ca/MSI-Geforce-192-bit-Support-Graphics/dp/B07ZHDZ1K6/ref=sr_1_16?crid=1M9LHOYX99CQW&keywords=Nvidia%2BGTX%2B1060&qid=1670109381&sprefix=nvidia%2Bgtx%2B1060%2Caps%2C79&sr=8-16&th=1', '300'), ('27-1 Terminal ScrewDriver', 'https://www.amazon.ca/Multi-Bit-Precision-Screwdriver-Tamperproof/dp/B09SVP9NDQ/ref=sr_1_2?crid=HXK64C1DIYEK&keywords=klein+27+in+1&qid=1670119650&sprefix=klein+27+in+1%2Caps%2C175&sr=8-2', '30'), ('2TB SSD', 'https://www.amazon.ca/Blue-NAND-2TB-SSD-WDS200T2B0A/dp/B073SBRHH6/ref=sr_1_16?crid=1XWXDCH9XNCTE&keywords=2tb+ssd&qid=1670784581&sprefix=2tb+ssd%2Caps%2C168&sr=8-16', '200')]
is no different from this:
[("a", "b", "c"), ("a", "b", "c"), ("a", "b", "c")]