It is also better to always use Requests here. If you get XML back, you will also need a parser, e.g. BeautifulSoup:

```python
import requests
from bs4 import BeautifulSoup

fmt = 'xml'
zn = '1740064'
url = f'https://api.aoikujira.com/zip/zip.php?fmt={fmt}&zn={zn}'
response = requests.get(url).content
soup = BeautifulSoup(response, 'lxml')
```

Example usage:
```python
>>> [i for i in soup.find_all('result')]
[<result name="api.aoikujira.com/zip"></result>, <result version="1.00"></result>, <result request_zip_num="1740064"></result>, <result result_code="1"></result>, <result database="2018-07-30"></result>]
>>> att = soup.find('value').attrs
>>> att
{'ken_kana': 'トウキョウト'}
>>> att['ken_kana']
'トウキョウト'
```

Using `fmt=json`, Requests can decode the response itself with `.json()` and give back a Python dictionary:

```python
import requests

fmt = 'json'
zn = '1740064'
url = f'https://api.aoikujira.com/zip/zip.php?fmt={fmt}&zn={zn}'
response = requests.get(url).json()
```

Example usage:
```python
>>> response
{'API_URL': 'http://api.aoikujira.com/zip/zip.php', 'address': '中台', 'address_kana': 'ナカダイ', 'city': '板橋区', 'city_kana': 'イタバシク', 'database': '2018-07-30', 'ken_kana': 'トウキョウト', 'result': '東京都板橋区中台', 'result_code': 1, 'state': '東京都', 'version': '1.01'}
>>> # Access is simple since it's a dictionary
>>> response['ken_kana']
'トウキョウト'
```
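For real use it is worth wrapping the calls above with a timeout and an HTTP error check. The sketch below is my own suggestion, not part of the API: the names `build_zip_url` and `lookup_zip` are hypothetical helpers around the same endpoint.

```python
import requests

def build_zip_url(zn, fmt='json'):
    # Build the same query URL used in the snippets above
    return f'https://api.aoikujira.com/zip/zip.php?fmt={fmt}&zn={zn}'

def lookup_zip(zn, fmt='json', timeout=5):
    # Fetch postcode data; raise on HTTP errors instead of
    # silently parsing an error page
    resp = requests.get(build_zip_url(zn, fmt), timeout=timeout)
    resp.raise_for_status()
    # JSON decodes to a dict; XML is returned raw for BeautifulSoup
    return resp.json() if fmt == 'json' else resp.content
```

With this, `lookup_zip('1740064')['ken_kana']` replaces the manual URL building shown earlier.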