
Decimal In Input Json To Python 3 Lambda Function

Consider the following input in a financial application, where precision matters:

{ "value": 3.8 }

and the following AWS Lambda function:

from decimal import Decimal

def lambda_handler(event): …

Solution 1:

Update: There is nothing wrong with your current solution.

There is no float to str to decimal.Decimal round-trip.

As the docs explain (my emphasis):

parse_float, if specified, will be called with *the string* of every JSON float to be decoded. By default, this is equivalent to float(num_str). This can be used to use another datatype or parser for JSON floats (e.g. decimal.Decimal).
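
For reference, a minimal sketch of such a handler, assuming the incoming event carries the raw JSON document as a string under a (hypothetical) body key; the multiplication by Decimal('10') is only there to show the value stays exact:

import json
from decimal import Decimal

def lambda_handler(event, context=None):
    # parse_float=Decimal means json hands the original numeric text ("3.8")
    # straight to Decimal, so no binary float is ever created
    data = json.loads(event['body'], parse_float=Decimal)
    value = data['value']              # Decimal('3.8'), exact
    return str(value * Decimal('10'))  # '38.0'

print(lambda_handler({'body': '{"value": 3.8}'}))  # 38.0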


Initial answer below

Passing a float value to decimal.Decimal does not give you the precision you require, because a float is stored in binary rather than as decimal digits, so Decimal receives the nearest binary approximation instead of the exact value you wrote.

This can be avoided if you are able to pass string inputs to decimal.Decimal:

from decimal import Decimal

res1 = Decimal(3.8) * Decimal(10)      # Decimal(3.8) captures the binary approximation, not 3.8
res2 = Decimal('3.8') * Decimal('10')  # Decimal('3.8') is exactly 3.8

print(res1)  # 37.99999999999999822364316060
print(res2)  # 38.0

So one solution is to ensure you store/read JSON numeric data as strings instead of floats, for example:
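
A minimal sketch of that approach, assuming the producer of the JSON can be changed so the field arrives as a string:

from decimal import Decimal

def lambda_handler(event):
    # 'value' arrives as the string '3.8', so Decimal sees the exact decimal digits
    return Decimal(event['value']) * Decimal('10')

print(lambda_handler({'value': '3.8'}))  # 38.0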

Be careful: an implementation like the one below may also work, but it relies on str doing a particular job, i.e. rounding a float correctly to its decimal representation.

from decimal import Decimal

def lambda_handler(event):
    value = event['value']          # arrives as a float, e.g. 3.8
    print(Decimal(str(value)))      # str() rounds the float back to '3.8'

lambda_handler({"value": 3.8})  # 3.8
