- Type: Bug
- Resolution: Won't Fix
- Priority: Major - P3
- Affects Version/s: None
- Component/s: None
- (copied to CRM)
Customer opened a case for Spark connector 10.0.4 related to a change in behavior.
Previously, an empty document {} was accepted by the MongoDB Spark connector as an aggregation result when reading a DataFrame with a schema such as StructType(StructField("foo", ArrayType(StringType))). In such a case, the read DataFrame was something like:
foo
—
[]
Now, a strict mode seems to be the default, so the same read raises an exception. This is a breaking change for the customer when migrating an existing Spark application.
Not sure what can be done for the customer if anything. Any suggestions?
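To make the reported change concrete, here is a minimal sketch (plain Python, not the connector's actual code; the names `apply_schema` and `strict` are illustrative, not a real API) of the difference between the old permissive behavior and the new strict behavior when an empty document is projected onto a schema:

```python
# Hypothetical sketch of the behavior change described in this ticket.
# apply_schema/strict are illustrative names, not the connector's real API.

def apply_schema(doc, schema, strict=True):
    """Project `doc` onto `schema` (a {field: default} mapping).

    Permissive mode fills a missing field with the schema default
    (e.g. [] for an array field); strict mode raises instead.
    """
    row = {}
    for field, default in schema.items():
        if field in doc:
            row[field] = doc[field]
        elif strict:
            raise ValueError(f"missing field {field!r} in document {doc}")
        else:
            row[field] = default
    return row

# Roughly StructField("foo", ArrayType(StringType)) with [] as the default
schema = {"foo": []}

# Old (permissive) behavior: {} becomes a row with foo = []
print(apply_schema({}, schema, strict=False))  # {'foo': []}

# New (strict) behavior: {} is rejected
try:
    apply_schema({}, schema, strict=True)
except ValueError as err:
    print("error:", err)
```

Under this reading, the customer's application depends on the permissive behavior, and restoring it would require either a compatibility flag in the connector or a schema/pipeline change on the customer's side.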
- related to: SPARK-376 Use the schema to automatically project fields (Closed)