Apr-25-2020, 12:55 PM
I have gone through five or six Python requests and urllib tutorials, but none of them results in downloading the latest version of a file.
Here is what happens:
It downloads the latest version of bglug.py successfully. Everything works like a charm.
The second time I try, it doesn't fetch the latest version of the file. I know it does fetch *a* file, because I set it up to delete the file first (just for testing purposes), and when I check afterwards, the file is there again.
And it still won't download the latest version until the next day or so!
Is this server-side caching? Does Python have a web cache that I have to clear?
My code is as follows:
import requests, os

# Delete the old copy first (just for testing); guard so this
# doesn't raise FileNotFoundError when the file isn't there yet
if os.path.exists("bglug.py"):
    os.remove("bglug.py")

url = "https://raw.githubusercontent.com/TheTechRobo/bglugwatch-cleanslate/master/bglug.py"
r = requests.get(url, stream=True)
with open("bglug.py", "wb") as Writefile:
    for chunk in r.iter_content(chunk_size=1024):
        if chunk:
            Writefile.write(chunk)

Does anyone else have the same problem? Thanks!
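In case the stale copy comes from an intermediate HTTP cache (a proxy or CDN) rather than from Python itself, here is a sketch of a workaround I could try: sending no-cache request headers on the download. This uses urllib from the standard library instead of requests, and the fetch_fresh name is just for illustration. I don't know yet whether the caching is actually on GitHub's side.

```python
import urllib.request

def fetch_fresh(url, dest):
    # Ask any intermediate HTTP cache not to serve a stale copy.
    # (Assumption: the stale file comes from an HTTP cache, not Python.)
    req = urllib.request.Request(
        url,
        headers={"Cache-Control": "no-cache", "Pragma": "no-cache"},
    )
    # Stream the response to disk in 1 KiB chunks, like the original code
    with urllib.request.urlopen(req) as resp, open(dest, "wb") as f:
        while True:
            chunk = resp.read(1024)
            if not chunk:
                break
            f.write(chunk)
```

If the server (or a cache in front of it) honors the Cache-Control header, this should force a fresh copy; if GitHub's raw endpoint is caching server-side, request headers alone may not help.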