Jan-21-2017, 11:54 PM
Hello, I'm very new to Python, and I'm having trouble with some basic math. I'm getting a user to input a number and assigning it to a variable. Then I'm multiplying it by the value of another variable, which I've set to 0.1, and displaying the result. When the input is 3 and I multiply it by the other variable, instead of getting .3, I'm getting .3000000000004.
What could cause the multiplication to return what appears to be an incorrect value?
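For reference, here is a minimal session reproducing what I'm seeing, along with a workaround I found using the standard-library `decimal` module (the variable names are just placeholders for my actual code):

```python
from decimal import Decimal

# 0.1 cannot be represented exactly in binary floating point,
# so the product picks up a tiny rounding error.
rate = 0.1
result = 3 * rate
print(result)           # 0.30000000000000004

# Workaround 1: round the float for display purposes.
print(round(result, 1))  # 0.3

# Workaround 2: use Decimal (constructed from a string)
# for exact decimal arithmetic.
exact = 3 * Decimal("0.1")
print(exact)             # 0.3
```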