Aug-01-2018, 03:36 AM
I've written a program that calculates the average MPH speed of a race whose distance is measured in kilometers by default. My program runs, but I can't figure out how to verify that the result it prints out is free of semantic errors:
#!/usr/bin/env python3
# ThinkPythonExercise1_4.py
## Example Scenario:
## If you run a 10 kilometer race in 43 minutes 29 seconds, what is your
## average time per mile? What is your average speed in miles per hour?
## (Hint: there are 1.61 kilometers in a mile).
## Then it would be distance / ((43 * 60 + 29) / (60 * 60)) hours.

def getDistance():
    Km = int(input("How many kilometers is the race? "))
    miles = Km / 1.61  # divide by 1.61 to convert kilometers to miles
    print("In that case, the distance in miles is " + str(miles) + " miles.")
    return miles, Km

def getTime():
    print("Enter minutes and seconds it took to finish the race:")
    minutes = int(input("Minutes: "))
    seconds = int(input("Seconds: "))
    return minutes, seconds

def main():
    miles, Km = getDistance()  # unpack in the same order getDistance() returns
    minutes, seconds = getTime()
    averageSpeed = miles / ((minutes * 60 + seconds) / 3600)
    print("Your average speed was " + str(averageSpeed) + " MPH.")

main()

# distance = speed * time
# time = distance / speed
# speed = distance / time

Any ideas?
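One way to verify the result is to run the same arithmetic against the book's worked example (10 km in 43 minutes 29 seconds) and compare against a hand calculation. Below is a minimal sketch of that check; `average_mph` is a hypothetical helper I introduced for illustration, and it assumes the 1.61 km-per-mile factor from the exercise hint:

```python
def average_mph(km, minutes, seconds):
    miles = km / 1.61                        # convert kilometers to miles
    hours = (minutes * 60 + seconds) / 3600  # convert total time to hours
    return miles / hours                     # speed = distance / time

# Book's example: 10 km in 43:29 should work out to roughly 8.57 MPH
print(round(average_mph(10, 43, 29), 2))
```

If your program prints something close to that for the same inputs, the distance and time conversions are consistent; a wildly different number (e.g. around 13.8, which is the km/h figure) points to a units mix-up.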