Jul-14-2021, 03:31 PM
I'm trying to write a program that picks two random integers from 0-10, divides one by the other, and asks the user to enter the answer to one decimal place.
I keep on getting "Incorrect!" even though I think I'm typing in the correct answer.
The code is below. Please help.
import random

first = random.randint(0, 10)
second = random.randint(0, 10)
symbol = " / "
print(first, symbol, second, "=")
answer = first / second
answer = "{:.1f}".format(answer)  # division answer to 1 decimal place
print(answer)  # The print is for testing purposes.
user_answer = float(input("enter your answer to one decimal place"))
# mark answer
if user_answer == answer:  # correct
    print("Correct!")
else:  # incorrect
    print("Incorrect!")
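A likely cause of the always-"Incorrect!" result is that `"{:.1f}".format(answer)` turns `answer` into a string, so the later comparison pits a `str` against a `float` and can never be true. A minimal sketch of one possible fix, rounding both sides to floats instead of formatting to a string (the `check` helper is hypothetical, not from the original code; note also that `randint(0, 10)` for the divisor can produce 0 and raise `ZeroDivisionError`):

```python
# The formatted answer is a string, so comparing it to a float always fails:
formatted = "{:.1f}".format(7 / 2)   # "3.5" -- a str, not a float
print(formatted == 3.5)              # False: str never equals float

# One fix (a sketch): round the true answer instead of formatting it,
# so both sides of the comparison are floats.
def check(first, second, user_answer):
    """Return True if user_answer matches first/second to one decimal place."""
    if second == 0:
        raise ValueError("divisor is zero")  # randint(0, 10) can return 0
    return round(first / second, 1) == round(user_answer, 1)

print(check(7, 2, 3.5))   # True
print(check(7, 2, 3.6))   # False
```

Alternatively, you could format both values with the same `"{:.1f}"` string and compare the two strings; the key point is that both sides of `==` must be the same type.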