- Type: Bug
- Resolution: Works as Designed
- Priority: Minor - P4
- Affects Version/s: 2.2.0
- Component/s: API
- Environment: Ubuntu 14.04, Scala 2.11.8, Spark 2.2.0, MongoDB 3.4.7
When loading a DataFrame from a collection, transforming/appending data, and saving the result back to the same collection with Overwrite mode, the collection is dropped before Spark finishes evaluating the (lazy) read, so only the newly appended rows are inserted and the old data is lost.
If a Spark action (such as a count) is triggered before the write, the load completes before the drop and the process works as expected.
See attached code to reproduce the issue.
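The attachment itself is not reproduced here, but a minimal sketch of the scenario (assuming a local mongod and a hypothetical test.coll collection, with both spark.mongodb.input.uri and spark.mongodb.output.uri pointing at that same collection) would look like this:

{code:scala}
import com.mongodb.spark.MongoSpark
import org.apache.spark.sql.{SaveMode, SparkSession}

object OverwriteRepro {
  def main(args: Array[String]): Unit = {
    // Hypothetical URIs; input and output point at the SAME collection.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("overwrite-repro")
      .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.coll")
      .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/test.coll")
      .getOrCreate()

    // Lazy read: no documents are actually fetched yet.
    val df = MongoSpark.load(spark)

    // Some transformation that appends rows to the original data.
    val appended = df.union(df.limit(1))

    // Overwrite drops the target collection before the lazy read above
    // has pulled the original documents, so only the new rows survive.
    appended.write
      .format("com.mongodb.spark.sql")
      .mode(SaveMode.Overwrite)
      .save()

    // Per the report, forcing an action first avoids the data loss, e.g.:
    //   appended.cache()
    //   appended.count()   // materializes the data before the drop
    // followed by the same Overwrite save.

    spark.stop()
  }
}
{code}

The "Works as Designed" resolution is consistent with Spark's lazy evaluation: the read only executes when the write action runs, by which point the Overwrite save has already dropped the source collection.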