I use Spark SQL JDBC to load data from SQL Server that includes 0.000000 values [decimal(28,12)], and then save the DataFrame into MongoDB. I find that
{"Position" : NumberDecimal("0E-12")} is saved in MongoDB. When I load this data from MongoDB into a DataFrame and call show, the exception "Decimal scale (12) cannot be greater than precision (1)." is thrown. If I manually update the document to
{"Position" : NumberDecimal("0")}, it works fine. Is this a bug? Could you tell me how to fix it?
- is duplicated by: SPARK-187 "Infer decimals that have precision values larger than the scale" (Closed)