- Type: Bug
- Resolution: Won't Fix
- Priority: Major - P3
- Fix Version/s: None
- Affects Version/s: 2.2.3
- Component/s: None
- Labels: None
- Environment: Linux Mint 18.3, PySpark 2.21
In MongoDB, it is possible to store a negative zero decimal value (Decimal128). The Spark connector cannot read such a value: schema inference fails with an exception.
Example Document:
{ "data" : NumberDecimal("-0.00") }
Trying to read a collection that contains this document throws the following exception:
java.lang.ArithmeticException: Negative zero can not be converted to a BigDecimal
    at org.bson.types.Decimal128.bigDecimalValue(Decimal128.java:291)
    at com.mongodb.spark.sql.MongoInferSchema$.com$mongodb$spark$sql$MongoInferSchema$$getDataType(MongoInferSchema.scala:247)
    at com.mongodb.spark.sql.MongoInferSchema$$anonfun$com$mongodb$spark$sql$MongoInferSchema$$getSchemaFromDocument$1.apply(MongoInferSchema.scala:114)
    at com.mongodb.spark.sql.MongoInferSchema$$anonfun$com$mongodb$spark$sql$MongoInferSchema$$getSchemaFromDocument$1.apply(MongoInferSchema.scala:114)
    at scala.collection.Iterator$class.foreach(Iterator.scala:891)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at com.mongodb.spark.sql.MongoInferSchema$.com$mongodb$spark$sql$MongoInferSchema$$getSchemaFromDocument(MongoInferSchema.scala:114)
    at com.mongodb.spark.sql.MongoInferSchema$$anonfun$2.apply(MongoInferSchema.scala:78)
    at com.mongodb.spark.sql.MongoInferSchema$$anonfun$2.apply(MongoInferSchema.scala:78)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
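A minimal PySpark read that triggers schema inference (and therefore the exception above) might look like the sketch below; the URI, database, and collection names are assumptions for illustration only:

from pyspark.sql import SparkSession

# Placeholder URI, database, and collection names.
spark = (SparkSession.builder
         .appName("negative-zero-decimal-repro")
         .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.coll")
         .getOrCreate())

# load() without an explicit schema makes the connector sample the collection
# via MongoInferSchema, which is where the ArithmeticException is raised.
df = spark.read.format("com.mongodb.spark.sql.DefaultSource").load()
df.printSchema()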