Support copy existing when using Spark streams


    • Type: New Feature
    • Resolution: Unresolved
    • Priority: Major - P3
    • Affects Version/s: None
    • Component/s: Source, Stream
    • Documentation Changes: Needed

      1. What would you like to communicate to the user about this feature?
      Update the FAQ with instructions on how to perform the copy of existing data. Use the syntax example provided by Ross in the comments.

      3. Which versions of the driver/connector does this apply to?
      10.x


      Build a mechanism to ensure that all data has been synced between the already existing source (typically MongoDB) and the sink (typically a data lake), and that the copy is performed in an exactly-once manner.
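
      To illustrate the gap this feature would close, here is a minimal PySpark sketch. It is not the syntax example from Ross's comment (which is not reproduced here); it assumes connector 10.x, and the connection URI, database, collection, and data-lake paths are placeholders. Because the batch copy and the change stream must currently be started separately, events arriving between the two steps can be missed or duplicated; that window is what a built-in "copy existing" mechanism would close.

      # A minimal sketch, assuming MongoDB Spark Connector 10.x and Spark
      # Structured Streaming. URI, database, collection, and sink paths
      # are placeholders, not values from this ticket.
      from pyspark.sql import SparkSession

      spark = (
          SparkSession.builder.appName("copy-existing-then-stream")
          .config("spark.mongodb.read.connection.uri", "mongodb://localhost:27017")
          .getOrCreate()
      )

      # Step 1: one-time batch copy of the data that already exists in
      # the source collection.
      existing = (
          spark.read.format("mongodb")
          .option("database", "mydb")
          .option("collection", "mycoll")
          .load()
      )
      existing.write.mode("overwrite").parquet("/datalake/mycoll")

      # Step 2: stream subsequent changes from the same collection into
      # the same sink. Changes that occur between step 1 and step 2 can
      # be missed or duplicated; performing the hand-off exactly once is
      # what this ticket asks the connector to handle itself.
      changes = (
          spark.readStream.format("mongodb")
          .option("database", "mydb")
          .option("collection", "mycoll")
          .option("change.stream.publish.full.document.only", "true")
          .load()
      )
      (
          changes.writeStream.format("parquet")
          .option("path", "/datalake/mycoll")
          .option("checkpointLocation", "/datalake/checkpoints/mycoll")
          .outputMode("append")
          .start()
          .awaitTermination()
      )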

              Assignee:
              Unassigned
              Reporter:
              Ross Lawley
              Votes:
              1
              Watchers:
              6
