Writing DF with overwrite mode drops collection before reading it


    • Type: Bug
    • Resolution: Works as Designed
    • Priority: Minor - P4
    • Affects Version/s: 2.2.0
    • Component/s: API
    • Environment:
      Ubuntu 14.04, scala 2.11.8, spark 2.2.0, mongo 3.4.7

      When loading a DF from a collection, transforming/appending data, and saving the result back to the same collection with overwrite mode, the collection is dropped before Spark finishes evaluating the plan, so only the newly appended data is inserted and the old data is lost.

      If a Spark action (such as a count) is triggered before the write, the load completes before the drop and the process works fine.

      See attached code to reproduce the issue.

        1. Bug.scala (1 kB) — Alvaro Berdonces
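
      The behaviour described above can be sketched as follows. This is a minimal, hedged reconstruction (not the attached Bug.scala): the connector format name, the connection URIs, and the specific transformation are assumptions for illustration, and running it requires a live Spark and MongoDB setup.

      ```scala
      import org.apache.spark.sql.{SaveMode, SparkSession}

      object OverwriteRepro {
        def main(args: Array[String]): Unit = {
          val spark = SparkSession.builder()
            .appName("overwrite-drops-collection")
            // Hypothetical URIs: read and write target the SAME collection.
            .config("spark.mongodb.input.uri", "mongodb://localhost/test.coll")
            .config("spark.mongodb.output.uri", "mongodb://localhost/test.coll")
            .getOrCreate()

          // Lazy load: no documents are actually read from MongoDB yet.
          val df = spark.read
            .format("com.mongodb.spark.sql.DefaultSource")
            .load()

          // Any transformation that appends rows (illustrative only).
          val appended = df.union(df)

          // Workaround: cache and force evaluation BEFORE the write, so the
          // source data is materialised before overwrite drops the collection.
          appended.cache()
          appended.count()

          // Without the count() above, this write drops the collection first,
          // the lazy read then sees an empty collection, and old data is lost.
          appended.write
            .format("com.mongodb.spark.sql.DefaultSource")
            .mode(SaveMode.Overwrite)
            .save()

          spark.stop()
        }
      }
      ```

      The key point is Spark's lazy evaluation: the read is only executed when the write action runs, by which time overwrite mode has already dropped the target collection.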

            Assignee:
            Ross Lawley
            Reporter:
            Alvaro Berdonces
            Votes:
            0
            Watchers:
            3

              Created:
              Updated:
              Resolved: