Python Forum
Python Code Help Needed
#1
Hi, could someone post this thread in the Jobs section of the forum for me?

I tried before but was unable to.

Here is the text:

Hi there,

I am currently struggling to get my Python code to download all the .zip files from the URLs I obtain from a path on www.flightsim.com. There are 253 pages, with zip files on every one of them. When I type my password, after a few seconds the message "Login Unsuccessful" is displayed.

Can anyone help me? I would be willing to pay a small fee if someone can.

Here is the Python code:

import sys
import getpass
import hashlib
import requests
 
 
BASE_URL = 'https://www.flightsim.com/'
LOGIN_URL = 'https://www.flightsim.com/vbfs/login.php?do=login'


def do_login(credentials):
    session = requests.Session()
    session.get(BASE_URL)  # pick up the site's initial cookies
    req = session.post(LOGIN_URL, data=credentials)
    # Note: vBulletin often returns HTTP 200 even for a failed login, so
    # it is worth also checking the response body for a failure message.
    if req.status_code != 200:
        print('Login not successful')
        sys.exit(1)
    # session now carries the authentication cookies
    return session
 
 
def get_credentials():
    username = input('Username: ')
    password = getpass.getpass()
    password_md5 = hashlib.md5(password.encode()).hexdigest()
    return {
        'cookieuser': 1,
        'do': 'login',
        's': '',
        'securitytoken': 'guest',
        'vb_login_md5_password': password_md5,
        'vb_login_md5_password_utf': password_md5,
        'vb_login_password': '',
        'vb_login_password_hint': 'Password',
        'vb_login_username': username,
        }
 
 
credentials = get_credentials()
session = do_login(credentials)

import Fspaths
from bs4 import BeautifulSoup
import requests
 
 
class ScrapeUrlList:
    def __init__(self):
        self.fpath = Fspaths.Fspaths()
        self.ziplinks = []
 
    def get_url(self, url):
        # Note: this uses a plain requests.get, not the logged-in session
        # from the login script above, so member-only pages may not load.
        page = None
        response = requests.get(url)
        if response.status_code == 200:
            page = response.content
        else:
            print(f'Cannot load URL: {url}')
        return page
 
    def get_catalog(self):
        base_url = 'https://www.flightsim.com/vbfs'
        with self.fpath.links.open('w') as fp:
            for pageno in range(1, 254):
                url = f'{base_url}/fslib.php?searchid=65893537&page={pageno}'
                print(f'url: {url}')
                # fetch this results page, not the base catalog URL
                page = self.get_url(url)
                if page:
                    soup = BeautifulSoup(page, 'lxml')
                    zip_links = soup.find_all('div', class_='fsc_details')
                    for link in zip_links:
                        fp.write(f"{link.find('a').text}, "
                                 f"{base_url}/{link.find('a').get('href')}\n")
                else:
                    print(f'No page: {url}')
 
def main():
    sul = ScrapeUrlList()
    sul.get_catalog()
 
 
if __name__ == '__main__':
    main()
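The script above stops at writing the link list; the downloading step itself is still missing. Here is a minimal sketch of that step (the function name, the destination layout, and the "name, url" line format are my assumptions, based on what get_catalog writes), streaming each catalogued zip to disk through the authenticated session:

```python
import os


def download_zips(session, links_file, dest_dir='zips'):
    """Download every 'name, url' line in links_file into dest_dir.

    session is expected to be the logged-in requests.Session returned
    by do_login(); links_file is the file written by get_catalog().
    """
    os.makedirs(dest_dir, exist_ok=True)
    with open(links_file) as fp:
        for line in fp:
            # the name may itself contain ', ', so split from the right
            name, url = line.rstrip('\n').rsplit(', ', 1)
            target = os.path.join(dest_dir, os.path.basename(url))
            with session.get(url, stream=True) as resp:
                if resp.status_code != 200:
                    print(f'Skipping {url}: HTTP {resp.status_code}')
                    continue
                # stream to disk in chunks rather than buffering in memory
                with open(target, 'wb') as out:
                    for chunk in resp.iter_content(chunk_size=65536):
                        out.write(chunk)
            print(f'Saved {name} -> {target}')
```

Streaming with `stream=True` and `iter_content` keeps memory use flat even for large zips; reusing the logged-in session means the member-only download links get the authentication cookies for free.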
Regards

Eddie Winch
#2
Hello there,
Hope you are doing well.

I have checked the requirements, and I can easily accomplish the project at minimal cost, as I have 5+ years of experience in this domain.

You can add me on Skype: live:richard_25370

Please check your PMs.

Looking forward to your reply.
I am ready to start work right now.

Best regards,
Richard
#3
Also, I forgot to mention that the searchid changes after each session.
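Since the searchid is session-specific, the hard-coded value in the script will go stale. A hedged sketch of one way to cope (the helper names are mine, and how the first results page is reached is assumed): after logging in and opening the search results once, parse the fresh searchid out of that results-page URL (with requests, the final URL is available as `response.url`), then build the remaining page URLs from it.

```python
from urllib.parse import parse_qs, urlparse


def extract_searchid(results_url):
    """Pull the searchid query parameter out of a results-page URL."""
    query = parse_qs(urlparse(results_url).query)
    return query['searchid'][0]


def page_urls(searchid, pages=253):
    """Build the catalogue URL for every results page of this session."""
    base = 'https://www.flightsim.com/vbfs/fslib.php'
    return [f'{base}?searchid={searchid}&page={n}' for n in range(1, pages + 1)]


# example value mirrors the one in the script above
sid = extract_searchid('https://www.flightsim.com/vbfs/fslib.php?searchid=65893537&page=1')
print(page_urls(sid)[0])
```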
#4
The section of the File Library on the www.flightsim.com website that I want to download all the ZIP files from is: PAI: PAI Aircraft
#5
Hello sir,

Greetings for today.
I have 7+ years of industry experience, including 2 years working with Python. I have completed 2 big projects and worked on some existing ones. I can do this task. Kindly add me on Skype: "[email protected]". My hourly rate is very reasonable. Please add me and we will discuss further.

Thanks,
Amit

