I am trying to speed up a simple search task using all the cores of my server's CPUs. The task itself is simple: search for given strings inside a huge text data file, with one process per search value running on each core. I spent some time designing something like a search lookup table, a dictionary whose keys are the string_data values and whose values are a status flag. My intention was to do something like this:
    if some CPU core is free:
        look inside the search lookup table for the first string_data with status=0
        start the search for that string
        when it finishes, set its status=1
My problem is that I can't figure out how to manage the CPU cores and assign work to them.
Any advice would be helpful.
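In case it clarifies what I mean, here is a minimal sketch of the kind of dispatch I have in mind, using multiprocessing.Pool so a free worker automatically picks up the next pending search string (which would replace my status=0/1 flags). DATAFILE, NEEDLES, and find_string are placeholders for my real data:

    from multiprocessing import Pool

    DATAFILE = "huge.txt"        # placeholder path to the big data file
    NEEDLES = ["foo", "bar"]     # the string_data values from my lookup table

    def find_string(needle):
        """Scan the data file line by line for one search string."""
        with open(DATAFILE, "r", errors="ignore") as f:
            for lineno, line in enumerate(f, 1):
                if needle in line:
                    return needle, lineno   # first hit is enough for my use case
        return needle, None

    if __name__ == "__main__":
        # Pool() defaults to one worker per core; imap_unordered hands the
        # next needle to whichever worker finishes first.
        with Pool() as pool:
            for needle, hit in pool.imap_unordered(find_string, NEEDLES):
                print(needle, "->", hit)

But with this every worker re-reads the whole file, and I don't know if that is the right way to keep all 64 cores busy.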
I have a server with 64 cores (4 CPUs with 16 cores each) running Debian Linux 9 with Python 3.7. The data file is pure unstructured text, roughly 10^6 to 10^12 lines with ~10^4 characters per line. It is "alive": it gets updated a couple of times a day. I tried splitting it into smaller files and running the search on those, but after each update the split files are useless.
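For context, this is the kind of runtime chunking I would like to achieve instead of pre-splitting into files, so updates never invalidate anything: compute byte ranges on the fly, aligned to line boundaries, and let each worker seek to its own slice. Again just a sketch, with DATAFILE and N_CHUNKS as placeholders:

    import os

    DATAFILE = "huge.txt"   # placeholder path
    N_CHUNKS = 64           # one chunk per core

    def chunk_offsets(path, n_chunks):
        """Compute (start, end) byte ranges aligned to line boundaries."""
        size = os.path.getsize(path)
        step = size // n_chunks
        offsets = [0]
        with open(path, "rb") as f:
            for i in range(1, n_chunks):
                f.seek(i * step)
                f.readline()            # skip forward to the next full line
                offsets.append(f.tell())
        offsets.append(size)
        return list(zip(offsets[:-1], offsets[1:]))

    def search_chunk(start, end, needle):
        """Scan only the bytes in [start, end) for one search string."""
        with open(DATAFILE, "rb") as f:
            f.seek(start)
            while f.tell() < end:
                if needle.encode() in f.readline():
                    return True
        return False

Is this a reasonable direction, or is there a better way to distribute the work?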