Compass / COMPASS-1764

Compass handles the crazy keys document series

    • Type: Task
    • Resolution: Gone away
    • Priority: Major - P3
    • Fix Version/s: None
    • Affects Version/s: None
    • Component/s: CRUD, Performance
    • Labels: None

      Usage/steps to reproduce

      # Generate the document
      python3 mkbson.py 
      
      # Import the document
      mongorestore -d perf -c crazy_keys raw_crazy_keys.bson 
      

      Then navigate to the perf.crazy_keys collection in Compass.

      An extended set of documents can be created with this script and imported into MongoDB relatively easily. On my local machine I already have the following key counts and preliminary results:
      32^3=32,768
      48^3=110,592
      64^3=262,144 (slow, ~5-10s)
      80^3=512,000 (slow, ~30s)
      96^3=884,736
      140^3=2,744,000 (Compass 1.9.x betas crash with a memory exception)
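The attached mkbson.py is only 1 kB, so it is presumably a short generator along these lines. This is a hedged, stdlib-only sketch of how such a script might produce an n^3-key document as raw BSON; the names make_crazy_doc and encode_bson_doc are illustrative, not the attachment's actual contents:

```python
import struct

def encode_bson_doc(doc):
    """Minimal BSON encoder for a flat document whose values are all int32."""
    body = b""
    for name, value in doc.items():
        # element: type byte 0x10 (int32), cstring key, little-endian int32 value
        body += b"\x10" + name.encode("utf-8") + b"\x00" + struct.pack("<i", value)
    # document: int32 total length (includes itself and trailing NUL), elements, NUL
    return struct.pack("<i", len(body) + 5) + body + b"\x00"

def make_crazy_doc(n):
    # n**3 keys, mirroring the 32^3 / 48^3 / ... series above
    return {f"k{a}_{b}_{c}": 1 for a in range(n) for b in range(n) for c in range(n)}

if __name__ == "__main__":
    with open("raw_crazy_keys.bson", "wb") as f:
        f.write(encode_bson_doc(make_crazy_doc(32)))  # 32^3 = 32,768 keys
```

In practice one would likely just use bson.encode() from the pymongo distribution; the hand-rolled encoder is shown only to keep the sketch dependency-free.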

      Proposed solution - cap at 100k fields

      #intellectualhonesty - let the data drive your decision.

      Thus Compass (most likely in mongodb-schema) should stop scanning at about 100k fields per document (configurable as other optimizations are found) and report this Easter egg in the GUI (originally this was Issue 2 in COMPASS-1901).

      Acceptance criteria

      • A mongoimport-able version of these documents, perhaps with easier-to-understand names like crazy_keys_512000 (might already be done by COMPASS-1766; if so, drop up to 2 story points)
      • Cap field or schema processing at 100k fields, reporting this somehow, e.g. with properties such as "totalFieldCount", "analyzedFieldCount" (working titles to capture the semantic, change the syntax as needed).
      • A message displayed in this scenario, e.g. "100,000-field limit reached; remaining N fields are not displayed"
      • Appropriate unit tests
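The cap semantics above can be sketched as follows. This is a hedged Python illustration, not mongodb-schema code (which is JavaScript); totalFieldCount and analyzedFieldCount are the working titles from this ticket, and analyze_fields is a hypothetical function name:

```python
def analyze_fields(doc, max_fields=100_000):
    """Scan a document's fields, stopping once max_fields have been analyzed."""
    analyzed = {}
    for i, (key, value) in enumerate(doc.items()):
        if i >= max_fields:
            break  # stop scanning; remaining fields are counted but not analyzed
        analyzed[key] = type(value).__name__
    return {
        "totalFieldCount": len(doc),          # all fields present in the document
        "analyzedFieldCount": len(analyzed),  # fields actually scanned (<= cap)
        "limitReached": len(doc) > max_fields,
        "fields": analyzed,
    }

result = analyze_fields({f"k{i}": i for i in range(5)}, max_fields=3)
# result["limitReached"] is True; the GUI message would report the 2 skipped fields
```

The "limitReached" flag is what the GUI message in the criterion above would key off, with the skipped count computed as totalFieldCount - analyzedFieldCount.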

      Out of scope

      • Anything in the schema/indexes/document validation tabs (such as the react-select dropdown to choose an index name); this ticket is only about the Documents tab.

      Background

      So I let Friday afternoon get to me and challenged myself to create a document with ~2.7 million keys.

      After about 3 minutes, Compass crashes. The mongo shell is still doing something in CPU-land after 8 minutes so far.

      On different data sets, such as documents with ~500,000 fields (i.e. COMPASS-1766), Compass takes ~30 seconds or more to render, or crashes (COMPASS-1764).

      Would explicitly limiting Compass to, say, 100,000 fields with a message like "Compass cannot render documents with more than 100,000 fields" seem reasonable? Yes, but only when the user gets down to the 100,001st field.

      For documents under 100k fields, is a loading spinner enough? (i.e. ~6 seconds, so should not need a progress bar?).

      Proposed solutions: potentially push rendering to the GPU? Otherwise just fall back to "Loading..." text.

      Attachments:
        1. mkbson.py (1 kB)
        2. raw_crazy_keys.bson (15.70 MB)

            Assignee: Unassigned
            Reporter: Peter Schmidt (peter.schmidt)
            Votes: 0
            Watchers: 2
