Aug-14-2019, 01:14 PM
Hey,
First of all, sorry for my bad English, but I'm trying my best.
I'm trying to build a cookie scanner that collects all cookies (first- and third-party) from a website.
My first idea was to open the website with the Selenium chromedriver and then read the SQLite3 database that Chrome creates, because Selenium itself can't read third-party cookies. My problem is that the database is sometimes empty and sometimes not: when I open the database from Python I almost always get an empty result, but if I open it with a SQLite browser after my code has finished, I sometimes do see the cookies. I don't know why...
As an example, I open a Mozilla website:
```python
from selenium import webdriver
import os
import shutil
import sqlite3

browser_list_place = 0
browser_list = []
profiles_folder = "profiles"

def getcookies(url):
    if os.path.isdir(profiles_folder):
        shutil.rmtree(profiles_folder)
    co = webdriver.ChromeOptions()
    co.add_argument("--no-sandbox")
    co.add_argument("--user-data-dir=" + profiles_folder + "/" + str(browser_list_place))
    browser_list.append(webdriver.Chrome(r'D:\crawler\chromedriver.exe', options=co))
    browser_list[browser_list_place].set_page_load_timeout(30)
    browser_list[browser_list_place].get(url)
    #browser_list[browser_list_place].quit()
    for folder in range(0, browser_list_place + 1):
        con = sqlite3.connect(profiles_folder + "/" + str(folder) + "/Default/Cookies")
        cur = con.cursor()
        cur.execute("SELECT * FROM cookies")
        rows = cur.fetchall()
        for row in rows:
            print(row)

getcookies('https://developer.mozilla.org/de/')
```

In the Chrome browser I can see 7 cookies.
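A likely cause: Chrome writes its cookie store to disk asynchronously and flushes it on a clean shutdown, so while the `quit()` call stays commented out, the on-disk `Cookies` file is often still empty and may be locked by the running browser. A minimal sketch of the read side, assuming recent Chrome's schema (a `cookies` table with `host_key`, `name`, and `value` columns; note that modern Chrome often keeps the payload in `encrypted_value`, so `value` can be empty). It copies the file first to dodge the lock, and is demonstrated here against a throwaway database so it runs without a browser:

```python
import os
import shutil
import sqlite3
import tempfile

def read_cookie_db(db_path):
    """Snapshot the Cookies DB, then read host_key/name/value.

    Copying first avoids 'database is locked' errors while Chrome
    still holds the file. Column names assume recent Chrome versions.
    """
    tmp = db_path + ".copy"
    shutil.copy2(db_path, tmp)
    con = sqlite3.connect(tmp)
    try:
        rows = con.execute("SELECT host_key, name, value FROM cookies").fetchall()
    finally:
        con.close()
        os.remove(tmp)
    return rows

# Demonstration against a throwaway DB with the same table layout,
# so the sketch runs without a browser:
demo = os.path.join(tempfile.mkdtemp(), "Cookies")
con = sqlite3.connect(demo)
con.execute("CREATE TABLE cookies (host_key TEXT, name TEXT, value TEXT)")
con.execute("INSERT INTO cookies VALUES ('.mozilla.org', 'sessionid', 'abc123')")
con.commit()
con.close()
print(read_cookie_db(demo))  # -> [('.mozilla.org', 'sessionid', 'abc123')]
```

With a real profile you would call `browser.quit()` first, then point `read_cookie_db` at `profiles/0/Default/Cookies`.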
![[Image: Aq4p6.png]](https://i.stack.imgur.com/Aq4p6.png)
After all these problems I'm asking myself: is there a more efficient way to get all cookies from a website (first- and third-party) without Selenium?
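One option that avoids the SQLite file entirely (though it still drives a real browser) is the Chrome DevTools Protocol: Selenium's Chrome driver exposes `execute_cdp_cmd`, and the CDP command `Network.getAllCookies` returns every cookie in the session, third-party included. A minimal sketch, assuming Selenium 4 (which can locate chromedriver itself); the `get_all_cookies` helper and its naming are my own, not an established API:

```python
def cookies_from_cdp(response):
    """Flatten a Network.getAllCookies response dict into (domain, name, value) tuples."""
    return [(c["domain"], c["name"], c["value"]) for c in response.get("cookies", [])]

def get_all_cookies(url):
    # Imported lazily so cookies_from_cdp stays usable without Selenium installed.
    from selenium import webdriver
    co = webdriver.ChromeOptions()
    co.add_argument("--headless")
    driver = webdriver.Chrome(options=co)
    try:
        driver.get(url)
        # CDP sees the browser's whole cookie jar, including the
        # third-party cookies that driver.get_cookies() does not return.
        response = driver.execute_cdp_cmd("Network.getAllCookies", {})
    finally:
        driver.quit()
    return cookies_from_cdp(response)

# Shape of a CDP response, for illustration:
sample = {"cookies": [{"domain": ".mozilla.org", "name": "sid", "value": "abc", "path": "/"}]}
print(cookies_from_cdp(sample))  # -> [('.mozilla.org', 'sid', 'abc')]
```

Because the cookies come straight from the browser's memory, this sidesteps the flush-to-disk timing problem entirely.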
Thanks for your help!