Spark Connector / SPARK-145

Writing DF with overwrite mode drops collection before reading it

    • Type: Bug
    • Resolution: Works as Designed
    • Priority: Minor - P4
    • Fix Version/s: None
    • Affects Version/s: 2.2.0
    • Component/s: API
    • Labels: None
    • Environment:
      Ubuntu 14.04, Scala 2.11.8, Spark 2.2.0, MongoDB 3.4.7

      When loading a DF from a collection, transforming/appending data, and saving the result back to the same collection with overwrite mode, the collection is dropped before Spark finishes evaluating the DataFrame, so only the newly appended data is inserted (the old data is lost).

      If a Spark action (like a count) is triggered before the write, the load happens before the drop and the process works fine.

      See attached code to reproduce the issue.
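      The attachment is not reproduced here; the following is a minimal sketch of the scenario, assuming Spark Connector 2.2.x against a local MongoDB instance (the test.coll URIs and the extra rows are illustrative, not taken from the original report):

```scala
import com.mongodb.spark.MongoSpark
import org.apache.spark.sql.{SaveMode, SparkSession}

object OverwriteRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("SPARK-145-repro")
      .config("spark.mongodb.input.uri", "mongodb://localhost/test.coll")
      .config("spark.mongodb.output.uri", "mongodb://localhost/test.coll")
      .getOrCreate()
    import spark.implicits._

    // The load is lazy: no documents are read until an action runs.
    val df = MongoSpark.load(spark)

    // Hypothetical new rows; in practice their schema must match the collection.
    val extra = Seq(("new-doc", 1)).toDF("name", "value")
    val appended = df.union(extra)

    // Workaround from the report: trigger an action (here cache + count)
    // so the original documents are materialized before the write.
    df.cache()
    df.count()

    // Without the lines above, Overwrite drops the collection first and the
    // lazy read then sees an empty collection, so the old data is lost.
    MongoSpark.save(appended.write.mode(SaveMode.Overwrite))
  }
}
```

      Caching before the count makes the intent explicit: the original documents are held in Spark's memory, so the drop performed by the overwrite no longer discards the only copy the job can read.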

            Assignee: Ross Lawley (ross@mongodb.com)
            Reporter: Alvaro Berdonces (trivi)
            Votes: 0
            Watchers: 3
