requests - handling exceptions
#1
I'm reading Web Scraping with Python by Ryan Mitchell and trying to use requests instead of urllib. When dealing with exceptions, if the issue is that the page doesn't exist, this code works well:
import requests
from bs4 import BeautifulSoup

try:
    html = requests.get("http://pythonscraping.com/pages/p1.html")
    # raise_for_status() raises HTTPError for 4xx/5xx responses
    html.raise_for_status()
except requests.exceptions.HTTPError as e:
    print(e)
Output:
404 Client Error: Not Found for url: http://pythonscraping.com/pages/p1.html
but what to do when the server can't be reached at all? urllib has a URLError exception, but the requests module doesn't use it.

I may have more issues with exceptions in the future, so this thread will be a suitable place for them...
#2
Look at A Python guide to handling HTTP request failures.
#3
Thank you, I assume the answer is requests.exceptions.ConnectionError.
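
For reference, here is a minimal sketch of how I'd combine it with the earlier code. The URL and timeout value are just placeholders, and the ordering matters because ConnectionError and HTTPError are subclasses of RequestException:

import requests

url = "http://pythonscraping.com/pages/p1.html"
try:
    # timeout guards against a server that accepts the connection but never answers
    response = requests.get(url, timeout=5)
    response.raise_for_status()
except requests.exceptions.ConnectionError as e:
    # raised when the server can't be reached at all (DNS failure, refused connection, ...)
    print("Server could not be reached:", e)
except requests.exceptions.HTTPError as e:
    # raised by raise_for_status() for 4xx/5xx responses such as 404
    print(e)
except requests.exceptions.RequestException as e:
    # base class for everything else requests can raise (Timeout, TooManyRedirects, ...)
    print("Request failed:", e)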


