Python Forum

Full Version: Good way to have a worker queue accessible by multiple processes?
I have a couple of scripts scraping data from multiple websites. The next step is processing the data. I want to set up a worker that receives the data and processes it. What is a good pipeline/workflow approach to having one worker always running and waiting for tasks to come in?

I thought of something like an API server to process the requests, but is there a better solution?
If I were going to do it, I'd probably set up a simple Flask server. You could use sockets or some other kind of inter-process communication (including, maybe, writing to files or a database).
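A minimal sketch of that Flask idea, assuming a hypothetical /task endpoint: the scrapers POST their results as JSON, the server drops each item onto an in-memory queue, and a single background worker thread sits there forever pulling items off and processing them.

# Hedged sketch: endpoint name, port, and process() body are placeholders.
import queue
import threading

from flask import Flask, jsonify, request

app = Flask(__name__)
tasks = queue.Queue()

def process(item):
    # Replace with your real processing logic.
    print("processing", item)

def worker():
    # Single long-running worker: blocks until a task arrives.
    while True:
        item = tasks.get()
        process(item)
        tasks.task_done()

@app.route("/task", methods=["POST"])
def submit_task():
    # Scrapers POST JSON here; we just queue it and return immediately.
    tasks.put(request.get_json())
    return jsonify(status="queued"), 202

if __name__ == "__main__":
    threading.Thread(target=worker, daemon=True).start()
    app.run(port=5000)

The scrapers would then just do an HTTP POST (e.g. with requests) whenever they have data, and never block on the processing itself.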
See pyzmq. You can build messaging infrastructure with twenty lines of code. Or less.
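For example, here is a rough PUSH/PULL sketch with pyzmq (the port 5555 and the function names are arbitrary choices): each scraper connects a PUSH socket and sends its results, while the one worker binds a PULL socket and blocks waiting for data.

# Hedged sketch of a pyzmq PUSH/PULL pipeline between scrapers and one worker.
import zmq

def run_worker():
    # The worker binds and waits; scrapers can come and go.
    ctx = zmq.Context()
    sock = ctx.socket(zmq.PULL)
    sock.bind("tcp://*:5555")
    while True:
        data = sock.recv_json()   # blocks until a scraper pushes something
        print("processing", data)

def send_from_scraper(data):
    # Call this from a scraper process to hand work to the worker.
    ctx = zmq.Context()
    sock = ctx.socket(zmq.PUSH)
    sock.connect("tcp://localhost:5555")
    sock.send_json(data)

if __name__ == "__main__":
    run_worker()

The scrapers and the worker run as separate processes (even on separate machines), and ZeroMQ handles the queuing and reconnection for you.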