Python Forum
Queue maxsize
#1
I have a problem controlling the queue of threads that I run.
I want to run 50 threads (min(50, len(tickers)), so 50 at most), and when one thread finishes a new one should start, so that 50 threads are always running.

I added a piece of my code:

num_threads = min(50, len(tickers))
# set up the queue to hold all the tickers
q = Queue(maxsize=num_threads)
#load up the queue with the urls to fetch and the index for each job (as a tuple):
for ticker in tickers:
    #need the index and the url in each queue item.
    print(ticker)
    q.put(ticker)    
    worker = Thread(target=processData, args=(q,table))
    worker.setDaemon(True)
    worker.start()
def processData(q,table):
    while not q.empty():
        ticker = q.get()
        # my code is here

        q.task_done()
    return True
When I run the code, it just goes through the whole list; it seems more than 50 threads are working, not 50.
Can anyone point out where my issue is?
Reply
#2
    q.put(ticker) # <- put the ticker in the queue   
    worker = Thread(target=processData, args=(q,table)) # <- creating a new thread
    worker.setDaemon(True)
    worker.start() # <- start the new thread; the element is consumed immediately.
If you observe the queue, you will never hit the maximum queue size, because each element is consumed by the thread you start for it.
Normally there should be X daemon workers, all consuming the same queue.


Here is a solution with workers.
The code isn't tested.
from queue import Queue
from threading import Thread, current_thread


def processData(q, table):
    while not q.empty():
        ticker = q.get()
        print(ticker)
        q.task_done()
    # worker ends here
    # you don't need to return anything
    print('Worker', current_thread().ident, 'finished')


def main(tickers, num_threads=4, queue_size_min=50):
    table = []
    queue_size = min(queue_size_min, len(tickers))
    q = Queue(maxsize=queue_size)
    thread_parameters = {
        'target': processData,
        'args': (q, table),
        'daemon': True,
    }
    workers = [Thread(**thread_parameters) for _ in range(num_threads)]
    for worker in workers:
        worker.start()
    for ticker in tickers:
        q.put(ticker)
    # Threads are started as daemons
    # if there is no non-daemon thread left
    # the Python interpreter will exit
    q.join()
    # q.join waits for the queue until all tasks are finished
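The same fixed-worker-pool pattern can be sketched as a self-contained, runnable variant. Instead of letting each worker exit when `q.empty()` happens to be true (which is racy if the producer is still putting items), this version puts one `None` sentinel per worker so they shut down cleanly. The names `run`, `worker`, and the `.upper()` "processing" step are placeholders, not part of the original code:

```python
from queue import Queue
from threading import Thread


def worker(q, results):
    # Each worker pulls tickers from the shared queue until it
    # receives the None sentinel, then exits.
    while True:
        ticker = q.get()
        if ticker is None:
            q.task_done()
            break
        results.append(ticker.upper())  # placeholder for real processing
        q.task_done()


def run(tickers, num_threads=4):
    q = Queue()
    results = []
    threads = [
        Thread(target=worker, args=(q, results), daemon=True)
        for _ in range(num_threads)
    ]
    for t in threads:
        t.start()
    for ticker in tickers:
        q.put(ticker)
    for _ in threads:
        q.put(None)  # one sentinel per worker
    q.join()         # block until every item has been task_done()'d
    return results


print(sorted(run(["aapl", "msft", "goog"])))  # → ['AAPL', 'GOOG', 'MSFT']
```

With a bounded queue (`Queue(maxsize=...)`), `q.put` simply blocks the producer until a worker frees a slot, so the pool never processes more than `num_threads` items at once regardless of how many tickers there are.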
Reply
#3
I see where my issue is!
Thanks!
Reply


