Python Forum
multiple instances
#1
Hello,

Could you please help me out a bit? I'm writing a command-line monitoring tool for my Axis cameras. I put an endless while loop in getVideoStatus, so the first call never returns and the second instance is never created:

import requests
import time


class Axis:

    def __init__(self, ip='192.168.0.90', username='root', password='root', reachable=True):
        self.ip = ip
        self.username = username
        self.password = password
        self.base_url = 'http://{}:{}@{}/axis-cgi/'.format(username, password, ip)
        self.reachable = reachable

    def getVideoStatus(self):
        # poll the camera every 10 seconds and track whether it is reachable
        while True:
            try:
                requests.get('{}videostatus.cgi?status=1'.format(self.base_url))
                self.reachable = True
            except requests.RequestException:
                self.reachable = False
            time.sleep(10)


garden = Axis('10.0.0.10', 'root', 'root')
garden.getVideoStatus()

# this never gets executed:
lobby = Axis('10.0.0.20', 'root', 'root')
lobby.getVideoStatus()
#2
You can instantiate garden and lobby up front, then use threads to run each monitoring loop independently.
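For example, a minimal sketch of that approach, assuming the Axis class from the first post is already defined (the IPs and credentials are just the example values above):

import threading
import time

# one daemon thread per camera, each running its own endless polling loop
garden = Axis('10.0.0.10', 'root', 'root')
lobby = Axis('10.0.0.20', 'root', 'root')

for cam in (garden, lobby):
    threading.Thread(target=cam.getVideoStatus, daemon=True).start()

# the main thread stays free, e.g. to report the current status
while True:
    print('garden:', garden.reachable, '| lobby:', lobby.reachable)
    time.sleep(10)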
#3
You can solve this with asyncio and aiohttp.
Please look into the documentation if you use aiohttp: https://aiohttp.readthedocs.io/en/stable/

import itertools
import asyncio
import aiohttp


async def get(url):
    try:
        async with aiohttp.request('GET', url) as req:
            text = await req.text()
            status = req.status
    except aiohttp.ClientError as e:
        # catch explicit exceptions; a bare "except Exception" would
        # also swallow unrelated errors
        print(e)
        # return placeholders so the caller can still unpack the result
        return '', None
    else:
        # return the values when the request was successful
        return text, status

async def main(urls):
    for url in itertools.cycle(urls):
        # itertools.cycle makes this an endless loop over the urls
        text, status = await get(url)
        print(url, status, text[:15])
        # you have to use asyncio.sleep here; time.sleep would block the event loop
        await asyncio.sleep(1)

# the program should cycle through these domains
urls = ['http://google.de', 'http://golem.de', 'http://heise.de']
coro = main(urls) # this is the coroutine

try:
    # get the current event loop
    loop = asyncio.get_event_loop()
    # run the loop until it's complete, which will not happen
    loop.run_until_complete(coro)
except KeyboardInterrupt:
    print('\rClosing loop')
    loop.close()
    # does not wait for queued tasks
    # and closes the loop
This requires at least Python 3.6; Python 3.7 is even better for asyncio.
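On Python 3.7+ you can also drop the manual event loop handling and use asyncio.run(), which creates and closes the loop for you. A minimal sketch, reusing main() and urls from above:

try:
    # asyncio.run() sets up and tears down the event loop (Python 3.7+)
    asyncio.run(main(urls))
except KeyboardInterrupt:
    print('\rClosing loop')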

If you think asyncio is too complex, you can use threads.

Without threads and without asyncio, you have to handle all the different requests inside a single loop.
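A minimal sketch of that single-loop variant, assuming the Axis class from the first post (the request is issued directly here because getVideoStatus() never returns):

import time
import requests

cameras = [Axis('10.0.0.10', 'root', 'root'), Axis('10.0.0.20', 'root', 'root')]

while True:
    # poll every camera once per pass, then sleep
    for cam in cameras:
        try:
            requests.get('{}videostatus.cgi?status=1'.format(cam.base_url))
            cam.reachable = True
        except requests.RequestException:
            cam.reachable = False
        print(cam.ip, 'reachable:', cam.reachable)
    time.sleep(10)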

