Posts: 7,319
Threads: 123
Joined: Sep 2016
Mar-30-2019, 02:02 PM
(This post was last modified: Mar-30-2019, 02:03 PM by snippsat.)
Quote: "call record1 the third field s174"
>>> new_record
[['1', 'WAN HAI 102', 'S174', 'HIT4', '2019-04-06', '2019-04-07'],
['2', 'WAN HAI 102', 'W174', 'HIT4', '2019-04-06', '2019-04-07'],
['3', 'WAN HAI 102', 'S175', 'HIT4', '2019-04-20', '2019-04-21'],
['4', 'WAN HAI 102', 'W175', 'HIT4', '2019-04-20', '2019-04-21'],
['5', 'WAN HAI 102', 'S176', 'HIT4', '2019-05-04', '2019-05-05'],
['6', 'WAN HAI 102', 'W176', 'HIT4', '2019-05-04', '2019-05-05']]
>>> new_record[0][2]
'S174'
>>> # All on index 2
>>> [i[2] for i in new_record]
['S174', 'W174', 'S175', 'W175', 'S176', 'W176']
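If the index numbers get hard to follow, the same rows can also be turned into dicts keyed by field name. A minimal sketch; the field names below are my guesses from the data shown, not names the site uses:

```python
# Assumed field names for each 6-column row (my guesses, not from the site)
fields = ['id', 'vessel', 'voyage', 'terminal', 'etb', 'etd']

new_record = [
    ['1', 'WAN HAI 102', 'S174', 'HIT4', '2019-04-06', '2019-04-07'],
    ['2', 'WAN HAI 102', 'W174', 'HIT4', '2019-04-06', '2019-04-07'],
]

# Pair each field name with its value in the row
records = [dict(zip(fields, row)) for row in new_record]
print(records[0]['voyage'])           # same value as new_record[0][2] -> 'S174'
print([r['voyage'] for r in records]) # -> ['S174', 'W174']
```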
Posts: 10
Threads: 1
Joined: Mar 2019
(Mar-30-2019, 02:02 PM)snippsat Wrote: [...]
Brother, thanks for your big help.
Posts: 10
Threads: 1
Joined: Mar 2019
import requests
from bs4 import BeautifulSoup
import datetime

url = "https://cplus.hit.com.hk/enquiry/vesselScheduleEnquiryAction.do"
ETBFrom = datetime.date.today().strftime("%d-%m-%Y")
ETBTo = '11-05-2019'
query = '%E6%90%9C%E7%B4%A2'
f = open("file.txt", "r")
print(f.read())
vesselName = ["WAN+HAI+272", "WAN+HAI+172", "WAN+HAI+171"]
for vslcount in vesselName:
    #print(vslcount)
    new_url = f'{url}?vesselName={vslcount}&ETBFrom={ETBFrom}&ETBTo={ETBTo}&query={query}'
    response = requests.get(new_url)
    soup = BeautifulSoup(response.content, 'html.parser')
    vessel = soup.find_all('td', class_="body")[1]
    record = []
    for item in vessel.find_all('td')[6:-3]:
        record.append(item.text.strip())
    new_record = []
    for i in range(0, len(record), 6):
        new_record.append(record[i:i+6])
    print(new_record[0:2])

file.txt:
VSLNAME1
VSLNAME2
Brother, how do I get all the vessel names from the txt file?
Posts: 10
Threads: 1
Joined: Mar 2019
Brother,
finally I want to check my own data and compare it with the web.
Posts: 7,319
Threads: 123
Joined: Sep 2016
Apr-04-2019, 07:00 PM
(This post was last modified: Apr-04-2019, 07:00 PM by snippsat.)
(Apr-04-2019, 07:26 AM)yimchiwai Wrote: Brother, how do I get all the vessel names from the txt file?

import requests
from bs4 import BeautifulSoup
import datetime

url = "https://cplus.hit.com.hk/enquiry/vesselScheduleEnquiryAction.do"
ETBFrom = datetime.date.today().strftime("%d-%m-%Y")
ETBTo = '11-05-2019'
query = '%E6%90%9C%E7%B4%A2'
vesselName = ["WAN+HAI+272", "WAN+HAI+172", "WAN+HAI+171"]
for vslcount in vesselName:
    with open(f'{vslcount}.txt', 'w', encoding='utf-8') as f_out:
        #print(vslcount)
        new_url = f'{url}?vesselName={vslcount}&ETBFrom={ETBFrom}&ETBTo={ETBTo}&query={query}'
        response = requests.get(new_url)
        soup = BeautifulSoup(response.content, 'html.parser')
        vessel = soup.find_all('td', class_="body")[1]
        record = []
        for item in vessel.find_all('td')[6:-3]:
            record.append(item.text.strip())
        new_record = []
        for i in range(0, len(record), 6):
            new_record.append(record[i:i+6])
        print(new_record[0:2])
        f_out.write(f'{new_record[0:2]}')

(Apr-04-2019, 12:04 PM)yimchiwai Wrote: Finally I want to check my own data and compare it with the web.

You have to try to do something yourself.
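About getting the vessel names out of file.txt: in the earlier code the file is only printed, never used. A minimal sketch, assuming file.txt holds one vessel name per line (e.g. "WAN HAI 272"); spaces are replaced with '+' so the names fit the query string. The sample file is created here just for demonstration:

```python
# Create a sample file.txt for demonstration (assumed format: one name per line)
with open('file.txt', 'w', encoding='utf-8') as f:
    f.write('WAN HAI 272\nWAN HAI 172\nWAN HAI 171\n')

# Read the names, skip blank lines, and replace spaces with '+' for the URL
with open('file.txt', encoding='utf-8') as f:
    vesselName = [line.strip().replace(' ', '+') for line in f if line.strip()]

print(vesselName)  # -> ['WAN+HAI+272', 'WAN+HAI+172', 'WAN+HAI+171']
```

The resulting vesselName list can then replace the hardcoded one in the loop above.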
Posts: 10
Threads: 1
Joined: Mar 2019
brother...
help help help
Posts: 7,319
Threads: 123
Joined: Sep 2016
(Apr-05-2019, 12:24 AM)yimchiwai Wrote: help help help Should at least answer if my last code to write you web stuff to files did work for you
You have to show some effort in solving the last task,if not post in job part of forum.
Posts: 10
Threads: 1
Joined: Mar 2019
Sorry, bro. I tried to put the program online... but my concept is wrong...
I don't know how to fix it... sorry for the trouble...
So... let me show you my final attempt:
import requests
from bs4 import BeautifulSoup
import datetime
from flask import Flask

app = Flask(__name__)

@app.route('/')
def hello_world():
    return checkvoy()  # must call the function; returning checkvoy itself was the bug

def checkvoy():
    url = "https://cplus.hit.com.hk/enquiry/vesselScheduleEnquiryAction.do"
    ETBFrom = datetime.date.today().strftime("%d-%m-%Y")
    ETBTo = '11-05-2019'
    query = '%E6%90%9C%E7%B4%A2'
    f = open("file.txt", "r")
    print(f.read())
    vesselName = ["WAN+HAI+272", "WAN+HAI+172", "WAN+HAI+171"]
    result = []
    for vslcount in vesselName:
        #print(vslcount)
        new_url = f'{url}?vesselName={vslcount}&ETBFrom={ETBFrom}&ETBTo={ETBTo}&query={query}'
        response = requests.get(new_url)
        soup = BeautifulSoup(response.content, 'html.parser')
        vessel = soup.find_all('td', class_="body")[1]
        record = []
        for item in vessel.find_all('td')[6:-3]:
            record.append(item.text.strip())
        new_record = []
        for i in range(0, len(record), 6):
            new_record.append(record[i:i+6])
        result.append(str(new_record[0:2]))
    return '<br>'.join(result)  # a Flask view must return a response, e.g. a string

if __name__ == '__main__':
    app.run()
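For the "compare my data with the web" part, one possible approach is to compare the voyage codes (index 2 of each row) as sets. A sketch with made-up sample data standing in for the real file contents and scrape results:

```python
# Voyage codes from my own data (made-up sample values)
local_voyages = {'S174', 'W174', 'S175'}

# Rows as scraped from the web (made-up sample, same shape as new_record)
scraped_records = [
    ['1', 'WAN HAI 102', 'S174', 'HIT4', '2019-04-06', '2019-04-07'],
    ['3', 'WAN HAI 102', 'S175', 'HIT4', '2019-04-20', '2019-04-21'],
    ['5', 'WAN HAI 102', 'S176', 'HIT4', '2019-05-04', '2019-05-05'],
]
web_voyages = {row[2] for row in scraped_records}

# Set difference in both directions shows the mismatches
print(sorted(local_voyages - web_voyages))  # in my data but not on the web -> ['W174']
print(sorted(web_voyages - local_voyages))  # on the web but not in my data -> ['S176']
```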