Python Forum

Full Version: Unable to use random.choice(list) in async method
I need to pull a random proxy from a list in an async method but the code exits the method as soon as it hits the line of code:
proxy = random.choice(proxy_list)
Here's the full method and the next line (the "print" line) never executes.
async def download_site(session, url):
    proxy_list = [
        (754, '38.39.205.220:80'),
        (681, '38.39.204.100:80'),
        (682, '38.39.204.101:80'),
        (678, '38.39.204.94:80'),
    ]
    proxy = random.choice(proxy_list)
    print(proxy[1])
    async with session.get(url, proxy="http://" + proxy[1]) as response:
        print("Read {0} from {1}".format(response.content_length, url))
Can someone please share a work-around or a fix for this?
Thanks.
But how are you calling that function? Unless something awaits it, the coroutine is never scheduled to execute. Can you post the full code? How do you know that line is the one causing the problem? Do you have a traceback?

(It seems to run that section for me when I call it via asyncio.run())
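To illustrate that point with a minimal sketch (illustrative only, not the original poster's code): calling an async def function does not run its body; it only creates a coroutine object, and the body executes only when something drives it through the event loop.

```python
import asyncio

async def pick():
    # This body runs only once the coroutine is actually awaited/scheduled.
    return 42

# Calling the function does NOT execute the body; it returns a coroutine object.
coro = pick()
print(type(coro).__name__)   # coroutine

# Only driving it through the event loop executes the body.
result = asyncio.run(coro)
print(result)                # 42
```

So if download_site is called without await (or without being wrapped in a task that the loop runs), every line inside it — including the random.choice call — is simply never reached.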
This works

import asyncio
import random as rnd

async def download_site():
    proxy_list = [
        (754, '38.39.205.220:80'),
        (681, '38.39.204.100:80'),
        (682, '38.39.204.101:80'),
        (678, '38.39.204.94:80')
        ]
    await asyncio.sleep(1)
    proxy = rnd.choice(proxy_list)
    print(proxy)

asyncio.run(download_site())
Output:
(682, '38.39.204.101:80')
Thanks @bowlofred and @menator01. This is a small app and I should have just posted all the code the first time.

I added the await asyncio.sleep(1) line and it still did not work. Please see the code below, and THANK YOU very much.

Line 34 does not execute.
Also, once I get this working, I would like to move the list out of this method and get the proxy from the method get_proxy().

import asyncio
import time
import aiohttp

# This code taken from:
# https://realpython.com/python-concurrency/#what-is-parallelism

# Info for adding headers for the proxy
# https://docs.aiohttp.org/en/stable/client_advanced.html

# Method to provide a random proxy
def get_proxy():
    proxy_list = [
        (754, '38.39.205.220:80'),
        (681, '38.39.204.100:80'),
        (682, '38.39.204.101:80'),
        (678, '38.39.204.94:80'),
    ]
    proxy = random.choice(proxy_list)
    print(proxy[1])
    return proxy

async def download_site(session, url):
    #For test, use this rather than calling get_proxy
    proxy_list = [
        (754, '38.39.205.220:80'),
        (681, '38.39.204.100:80'),
        (682, '38.39.204.101:80'),
        (678, '38.39.204.94:80'),
    ]
    await asyncio.sleep(1)
    proxy = random.choice(proxy_list)
    # This line does not execute
    print(proxy[1])
    async with session.get(url, proxy="http://" + proxy[1]) as response:
        print("Read {0} from {1}".format(response.content_length, url))


async def download_all_sites(sites):
    async with aiohttp.ClientSession() as session:
        tasks = []
        for url in sites:
            task = asyncio.ensure_future(download_site(session, url))
            tasks.append(task)
        await asyncio.gather(*tasks, return_exceptions=True)

if __name__ == "__main__":
    sites = [
        "https://www.jython.org",
       # "http://olympus.realpython.org/dice",
    ] #* 80
    start_time = time.time()
    asyncio.get_event_loop().run_until_complete(download_all_sites(sites))
    duration = time.time() - start_time
    print(f"Downloaded {len(sites)} sites in {duration} seconds")
Note: I modified the main block to use a list of one site rather than 160 to make it easier to step through and debug.
thanks all
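For anyone hitting the same symptom, two things in the posted code plausibly conspire to hide the error (assuming the posted snippet is the entire file): the imports are asyncio, time, and aiohttp only, so random.choice(proxy_list) raises a NameError, and asyncio.gather(*tasks, return_exceptions=True) returns that exception instead of raising it, which is why no traceback appears and the method seems to silently exit at that line. A small sketch of the swallowing behavior, and one way to surface it:

```python
import asyncio

async def boom():
    # Simulates the NameError raised by random.choice when `import random` is missing.
    raise NameError("name 'random' is not defined")
    print("never reached")  # like the print(proxy[1]) line in the thread

async def main():
    # With return_exceptions=True the exception is *returned* in the results list,
    # not raised, so nothing is printed and the task appears to just stop.
    results = await asyncio.gather(boom(), return_exceptions=True)
    for r in results:
        if isinstance(r, Exception):
            print(f"Swallowed: {r!r}")
    return results

results = asyncio.run(main())
```

Dropping return_exceptions=True (or inspecting the results as above) makes the underlying exception visible, which in this case points straight at the missing import random.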