Core Server / SERVER-4961

$group is taking 2x as long as collection.group()

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major - P3
    • Resolution: Fixed
    • Affects Version/s: 2.1.0
    • Fix Version/s: 2.2.0-rc2
    • Component/s: Aggregation Framework
    • Labels: None
    • Operating System: ALL

      Description

      This came from a DISQUS comment on the "Aggregation Framework" page:
      I'm testing MongoDB 2.1.0 in order to evaluate the performance of the new aggregation framework. I'm wondering why it's 2x slower in my use case.

      Here is the code I used before version 2.1.0 (using Python and pymongo):

      db.customers.group(
          {'segment': True},                          # key
          None,                                      # condition
          {'count': 0},                              # initial
          "function (obj, prev) { prev.count++; }"   # reduce
      )

      Here is the same computation using the new aggregation framework:

      db.command('aggregate', 'customers', pipeline=[
          {'$group': {'_id': '$segment', 'count': {'$sum': 1}}}
      ])

      On my computer with my dataset, the first version runs in ~1 s, the second version in ~2.5 s. Is it expected or am I doing something wrong?
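      Both snippets compute the same thing: a document count per distinct segment. As a sanity check on that equivalence, here is a minimal pure-Python sketch (the sample documents are hypothetical, standing in for the customers collection):

      ```python
      from collections import defaultdict

      # Hypothetical stand-in for the customers collection
      customers = [
          {'segment': 'retail'},
          {'segment': 'wholesale'},
          {'segment': 'retail'},
      ]

      # group()'s reduce function runs prev.count++ once per document,
      # and $group's {'$sum': 1} adds 1 per document: both are a per-key count.
      counts = defaultdict(int)
      for doc in customers:
          counts[doc['segment']] += 1

      # Shaped like the $group output: one document per distinct _id
      result = [{'_id': k, 'count': v} for k, v in sorted(counts.items())]
      print(result)  # [{'_id': 'retail', 'count': 2}, {'_id': 'wholesale', 'count': 1}]
      ```

      Given that both express the same computation, any timing gap comes from the server-side execution paths, not from what is being computed.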

              People

              • Votes: 2
              • Watchers: 3
