Python Forum

Fastest Way of Writing/Reading Data
I have inherited a script that scrapes data from the web every few seconds and stores it in a MySQL database. It then deletes data that is more than 30 seconds old, so there are never more than 5-6 records in the database. Each record consists of two fields: the scrape time and the data.

I don't know why MySQL has been used to store the data, but it seems unnecessarily cumbersome (although I am a beginner).

Is my thinking correct? If so, what would be the quickest way to write/read/clear data in these circumstances?
You can use a flat file for storage. Open it using mode 'w+' ('a+' if you wish to keep the old contents), and after each write make a call to flush().
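
For reference, a tiny sketch of the difference between those two modes (not from the thread, just standard open() behavior):

# 'w+' truncates on open, so each run starts with an empty file.
with open('MyData.txt', 'w+') as fp:
    fp.write('newest record\n')   # previous contents are gone

# 'a+' opens for appending, so old contents are kept.
with open('MyData.txt', 'a+') as fp:
    fp.write('another record\n')  # added after whatever was there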

example -- something like this (adjust the names and the placeholder data to suit):
import os
from pathlib import Path

# Ensure the starting directory is the same as the script's directory.
os.chdir(Path(__file__).resolve().parent)

# Reference to the script directory.
homepath = Path('.')

# Create a subdirectory to hold the output data (move this wherever you wish).
# If the path already exists, mkdir does nothing; otherwise it creates it.
OutDatapath = homepath / 'data_out'
OutDatapath.mkdir(exist_ok=True)

outfile = OutDatapath / 'MyData.txt'  # rename as you wish

scraped_data = 'example record\n'  # stand-in for whatever your scraper returns

with outfile.open('w+') as fp:
    # ... your code here
    fp.write(scraped_data)
    fp.flush()
Since the file is flushed after each write, another program can open it for reading and see all data up to the last write.
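
As a minimal sketch of that read side, which also mirrors the "delete anything older than 30 seconds" step the original script does in MySQL: this assumes a hypothetical layout of one tab-separated "<unix_timestamp>\t<data>" record per line, which the thread itself doesn't actually specify.

import time
from pathlib import Path

datafile = Path('data_out') / 'MyData.txt'

# Read whatever the writer has flushed so far.
with datafile.open('r') as fp:
    lines = fp.read().splitlines()

# Keep only records from the last 30 seconds, mirroring the
# delete step of the original MySQL-based script.
# Assumed line layout: "<unix_timestamp>\t<data>".
cutoff = time.time() - 30
fresh = []
for line in lines:
    stamp, _, data = line.partition('\t')
    if float(stamp) >= cutoff:
        fresh.append((float(stamp), data))

print(fresh)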