Jul-27-2021, 10:17 AM
I have inherited a script that scrapes data from the web every few seconds and stores it in a MySQL database. It then deletes data that is more than 30 seconds old, so there is a maximum of 5-6 records in the database at any time. Each record consists of 2 fields (scrape time and the data).
I don't know why MySQL was chosen to store the data, but it seems unnecessarily cumbersome for this (although I am a beginner).
Is my thinking correct? If so, what would be the quickest way to write/read/clear data in these circumstances?
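For context, if persistence across restarts or access from multiple processes is not needed, a plain in-memory structure can replace the database entirely. A minimal sketch of that idea, assuming a single-process script and using a `deque` of `(scrape_time, data)` tuples (all names here are illustrative, not from the original script):

```python
import time
from collections import deque

MAX_AGE = 30  # seconds, matching the 30-second retention described above

records = deque()  # each entry is a (scrape_time, data) tuple, oldest first

def add_record(data, now=None):
    """Store a new scrape and drop anything older than MAX_AGE."""
    now = time.time() if now is None else now
    records.append((now, data))
    prune(now)

def prune(now=None):
    """Delete records more than MAX_AGE seconds old, from the left."""
    now = time.time() if now is None else now
    while records and now - records[0][0] > MAX_AGE:
        records.popleft()

def current_records(now=None):
    """Return the records still within the retention window."""
    prune(now)
    return list(records)
```

Because records arrive in time order, expiry only ever removes from the front, which is what `deque.popleft()` does in O(1). If the data must survive a restart or be read by another program, a file-backed store such as SQLite (in the standard library as `sqlite3`) is a lighter-weight middle ground than a MySQL server.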