Dec-21-2017, 05:44 PM
I have a couple of scripts scraping data from multiple websites. The next step is processing that data. I want to set up a worker that receives the data and processes it. What is a good pipeline/workflow approach for having one worker always running and waiting for tasks to come in?
I thought of something like an API server to process the requests. But is there a better solution?
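For context, here's a rough in-process sketch of the pattern I have in mind, using only the standard library's `queue` and `threading` (the `.upper()` call is just a placeholder for the real processing step, and the names are made up):

```python
import queue
import threading

task_queue = queue.Queue()  # scrapers put raw data here
results = []                # processed output collects here

def worker():
    # Long-running worker: blocks until a task arrives.
    # A None task is used as a shutdown signal.
    while True:
        data = task_queue.get()
        if data is None:
            task_queue.task_done()
            break
        results.append(data.upper())  # placeholder "processing"
        task_queue.task_done()

t = threading.Thread(target=worker, daemon=True)
t.start()

# The scraper scripts would act as producers and just enqueue items:
for item in ["page one", "page two"]:
    task_queue.put(item)

task_queue.put(None)  # tell the worker to stop
task_queue.join()     # wait until everything has been processed
```

This only works inside a single process, though. Since my scrapers are separate scripts, I'd presumably need something that works across processes or machines, which is why I was thinking of an API server or a message broker.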