Not all sites work with pysftp.
You can write something with ftplib.
```python
from ftplib import FTP
import os

ftp = FTP('ftp.debian.org')
ftp.login()
ftp.cwd('debian')
ftp.cwd('doc')

# get filenames within the directory
filenames = ftp.nlst()
print(filenames)

# download the .txt files
for file in filenames:
    if '.txt' in file:
        file_name = os.path.join(r"E:/div", file)
        with open(file_name, "wb") as lf:
            ftp.retrbinary("RETR " + file, lf.write)
```

This almost works, but there is a problem with one file in the directory that stalls the download.
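One workaround for the stalling file, as a sketch rather than a tested fix, is to open the connection with a timeout and skip any file whose transfer raises an error, so a single bad file does not block the rest. The host, directories, and the `E:/div` target path are the same assumptions as above; the function names are my own.

```python
from ftplib import FTP, all_errors
import os

def txt_files(filenames):
    # keep only the '.txt' entries, same filter as the loop above
    return [f for f in filenames if '.txt' in f]

def download_all(host, remote_dirs, local_dir, timeout=30):
    # a timeout makes a stalled transfer raise instead of hanging forever
    ftp = FTP(host, timeout=timeout)
    ftp.login()
    for d in remote_dirs:
        ftp.cwd(d)
    failed = []
    for file in txt_files(ftp.nlst()):
        try:
            with open(os.path.join(local_dir, file), "wb") as lf:
                ftp.retrbinary("RETR " + file, lf.write)
        except all_errors:
            failed.append(file)  # skip the stalling file, keep going
    ftp.quit()
    return failed

if __name__ == '__main__':
    print(download_all('ftp.debian.org', ['debian', 'doc'], r"E:/div"))
```

Returning the list of failed names lets you see afterwards which file was the troublemaker.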
So you can write something different.
It gets all the files fine, but it is not recursive for sub-folders; adding that could be a fine training task to write.
```python
from bs4 import BeautifulSoup
import requests
from urllib.request import urlretrieve

url = 'http://ftp.debian.org/debian/doc/'
url_get = requests.get(url)
soup = BeautifulSoup(url_get.content, 'lxml')

# download every link whose href contains 'txt'
for name in soup.find_all('a'):
    href = name.get('href')
    if href and 'txt' in href:
        urlretrieve(url + href, href)
```