MongoDB Database Tools / TOOLS-2571

Large JSON data set export missing commas

    • Type: Bug
    • Resolution: Works as Designed
    • Priority: Minor - P4
    • Affects Version/s: 3.4.23
    • Component/s: mongoexport

      The exported file is 1413 MB. Many of the entries in the exported file are not separated by commas.

      The data I'm exporting is my server's StackStorm workflow executions, using the following command:

      'mongoexport -d st2 -c workflow_execution_d_b -o 05-03-2020_workflows_export.json --pretty'

      I found the issue when attempting to parse the JSON with Python:

      import json

      def parse(file_name):
          # Assumes the file contains a single JSON value (object or array)
          with open(file_name, 'r') as file:
              json_data = json.load(file)
          print(json.dumps(json_data))

      parse('05-03-2020_actions_export.json')

      Which gives the error:

      Traceback (most recent call last):
        File "st2_utils/json_parser.py", line 10, in <module>
          parse('05-03-2020_actions_export.json')
        File "st2_utils/json_parser.py", line 6, in parse
          json_data = json.load(file)
        File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/json/__init__.py", line 293, in load
          return loads(fp.read(),
        File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/json/__init__.py", line 357, in loads
          return _default_decoder.decode(s)
        File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/json/decoder.py", line 340, in decode
          raise JSONDecodeError("Extra data", s, end)
      json.decoder.JSONDecodeError: Extra data: line 9130 column 2 (char 301892)
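
      Note: mongoexport does not wrap its output in a JSON array unless --jsonArray is passed. By default it writes one document per line, and with --pretty it writes a series of pretty-printed documents back to back, so json.load stops with "Extra data" at the start of the second document. Below is a minimal sketch (not part of the original report) that walks such a file document by document with json.JSONDecoder.raw_decode, assuming the file produced by the command above:

      import json

      def iter_documents(file_name):
          # Decode one JSON document at a time from a mongoexport file that is
          # not a JSON array (default or --pretty output), skipping the
          # whitespace that separates documents.
          decoder = json.JSONDecoder()
          with open(file_name, 'r') as file:
              text = file.read()
          idx = 0
          while idx < len(text):
              while idx < len(text) and text[idx].isspace():
                  idx += 1
              if idx >= len(text):
                  break
              document, idx = decoder.raw_decode(text, idx)
              yield document

      documents = list(iter_documents('05-03-2020_workflows_export.json'))
      print(len(documents))

      Alternatively, re-exporting with the --jsonArray flag produces a single JSON array that the original parse() can load directly.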

            Assignee: Unassigned
            Reporter: Brian Thompson (brian.thompson@mastercard.com)
            Votes: 0
            Watchers: 2
