Python Forum
requests - handling exceptions
#1
I'm reading Web Scraping with Python by Ryan Mitchell and trying to use requests instead of urllib. When dealing with exceptions, if the issue is that the page doesn't exist, this code works well:
import requests
from bs4 import BeautifulSoup
try:
    html = requests.get("http://pythonscraping.com/pages/p1.html")
    html.raise_for_status()
except requests.exceptions.HTTPError as e:
    print(e)
Output:
404 Client Error: Not Found for url: http://pythonscraping.com/pages/p1.html
but what should I do when the server can't be reached at all? urllib has URLError, but that doesn't work with the requests module.

I may run into more issues with exceptions in the future, so this thread will be a good place for them...
Reply
#2
Look at A Python guide to handling HTTP request failures.
Reply
#3
Thank you, I assume the answer is ConnectionError.
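For anyone finding this later, a minimal sketch of how that fits together (the URL here is a placeholder; the `.invalid` TLD is reserved and never resolves, so it reliably triggers a connection failure):

```python
import requests

# Placeholder host for illustration: .invalid is guaranteed not to resolve.
url = "http://example.invalid/pages/p1.html"

try:
    response = requests.get(url, timeout=5)
    response.raise_for_status()
except requests.exceptions.HTTPError as e:
    # Server answered, but with a 4xx/5xx status.
    print("Bad status:", e)
except requests.exceptions.ConnectionError as e:
    # Server could not be reached at all (DNS failure, refused connection, ...).
    print("Server unreachable:", e)
except requests.exceptions.Timeout as e:
    # Server took too long to answer.
    print("Request timed out:", e)
```

All of these inherit from requests.exceptions.RequestException, so you can also catch that one base class if you don't need to distinguish the failure modes.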
Reply


