
Setting Python Decimal When Defining Precision Yields Inexact Decimal

Why can't I construct an exact Decimal object within the precision I configured?

from decimal import *
getcontext().prec = 5
num = Decimal(0.1234)
print(num)

I expected 0.1234, but the printed value is inexact.

Solution 1:

0.1234 is first converted to the nearest IEEE 754 binary (radix-2) floating-point number, and that approximation is then converted to a Decimal. Pass the value as a string instead:

>>> Decimal(0.1234)
Decimal('0.12339999999999999580335696691690827719867229461669921875')
>>> Decimal('0.1234')
Decimal('0.1234')
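
Also worth noting: the Decimal constructor is exact and ignores the context precision; prec only takes effect during arithmetic. A minimal REPL sketch illustrating this (the unary plus is just a convenient way to force a context-rounded copy):

>>> from decimal import Decimal, getcontext
>>> getcontext().prec = 5
>>> num = Decimal(0.1234)   # constructor converts the float exactly; prec is ignored
>>> +num                    # arithmetic (even unary plus) rounds to prec
Decimal('0.12340')
>>> Decimal('0.1234')       # string input never touches binary floats
Decimal('0.1234')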
