Python example: the correct way to get decimal results from division

  • 2020-04-02 14:00:51
  • OfStack

Consider the following calculation:


# Python 2
a = 1
b = 2
c = 3

print c*(a/b)   # always prints 0

The result is always 0. After double-checking, I found that in Python 2, dividing an integer by an integer yields an integer, so the fractional part is discarded.
That means a/b always evaluates to 0. The fix is to convert either a or b to a floating-point number.

a = 1
b = 2
c = 3

print c*(a/float(b))   # 1.5
print c*(float(a)/b)   # 1.5
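If every division in a script should return a decimal result, Python 2 also lets you switch the / operator to true division for the whole module via the standard __future__ import. A minimal sketch, assuming Python 2:

from __future__ import division  # must appear at the top of the module

a = 1
b = 2
c = 3

print c*(a/b)    # 1.5, because / now performs true division
print c*(a//b)   # 0, // still performs floor division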

Either way, we now get the exact result of a divided by b. Of course, if a is larger than b and we do not need a decimal result, the float conversion is unnecessary.
For example:

a = 1
b = 2
c = 3

print c/a         # 3
print c/b         # 1
print c/float(b)  # 1.5
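For completeness: in Python 3 the / operator always performs true division and // performs floor division, so no float() conversion is needed. A short sketch, assuming Python 3:

a = 1
b = 2
c = 3

print(c / b)    # 1.5, true division by default
print(c // b)   # 1, floor division when only the integer part is wanted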

