Node.js Driver / NODE-2027

MongoError: cursor id ### not found

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major - P3
    • Resolution: Gone away
    • Affects Version/s: 3.0.7
    • Fix Version/s: None
    • Component/s: core
    • Labels:
      None
    • Environment:
      Node.js Version: 10.8.0
      MongoDB Version: 4.0.8
      MongoDB Node.js Driver version: 3.0.7
      No Mongoose
    • MongoDB Version:
      Not Applicable

      Description

      • Node.js Version: v10.8.0
      • MongoDB Version: 4.0.8
      • MongoDB Node.js Driver version: 3.0.7

      While using the MongoDB Node.js driver's streams API, I consistently get a "MongoError: cursor id 171023944024 not found" error while reading large collections (>100,000 documents).

      Below is the stack trace:


      2019-06-24T20:07:38.185Z - error: [twitterData]  MongoError: cursor id 171023944024 not found
          at /opt/mongocbmigration/node_modules/mongodb-core/lib/connection/pool.js:598:61
          at authenticateStragglers (/opt/mongocbmigration/node_modules/mongodb-core/lib/connection/pool.js:516:16)
          at Connection.messageHandler (/opt/mongocbmigration/node_modules/mongodb-core/lib/connection/pool.js:552:5)
          at emitMessageHandler (/opt/mongocbmigration/node_modules/mongodb-core/lib/connection/connection.js:309:10)
          at Socket.<anonymous> (/opt/mongocbmigration/node_modules/mongodb-core/lib/connection/connection.js:452:17)
          at Socket.emit (events.js:180:13)
          at addChunk (_stream_readable.js:274:12)
          at readableAddChunk (_stream_readable.js:261:11)
          at Socket.Readable.push (_stream_readable.js:218:10)
          at TCP.onread (net.js:581:20)
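
      A likely mechanism behind this error (my reading, not something confirmed in this ticket): the server reaps cursors that sit idle longer than `cursorTimeoutMillis` (10 minutes by default), and a slow downstream pipeline creates backpressure that pauses the cursor stream between getMore calls. The next getMore after the cursor is reaped then fails with "cursor id ... not found". The pause-on-backpressure behavior can be seen with plain Node streams, no MongoDB required:

      ```javascript
      // Minimal sketch (plain Node streams, no MongoDB) of the backpressure that
      // can starve a cursor: pipe() pauses the upstream reader whenever the
      // destination's write() returns false. On a real cursor stream, a long
      // enough pause between getMore calls lets the server reap the cursor
      // (cursorTimeoutMillis defaults to 10 minutes), after which the next
      // getMore fails with "cursor id ... not found".
      const { Readable, Writable } = require('stream');

      let sent = 0;
      let pauses = 0;

      // Fast producer: hands out 100 small chunks as fast as it is asked.
      const producer = new Readable({
        read() {
          this.push(sent < 100 ? Buffer.from('doc') : null);
          sent++;
        },
      });

      // Slow consumer with a tiny buffer, so backpressure kicks in at once:
      // each write waits 5 ms before acknowledging its chunk.
      const consumer = new Writable({
        highWaterMark: 16,
        write(chunk, encoding, callback) {
          setTimeout(callback, 5);
        },
      });

      // pipe() calls producer.pause() whenever consumer.write() returns false.
      producer.on('pause', () => {
        pauses++;
      });

      producer.pipe(consumer).on('finish', () => {
        console.log('producer was paused ' + pauses + ' times by backpressure');
      });
      ```

      In the reporter's code the four piped stages plus the database insert at the end play the role of the slow consumer.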


      The code I use to stream the cursor is below:

      	mongo_db.collection(config.collection, async function (err, coll) {
      		coll.find(config.mongo_query).count(function (e, coll_docs_count) {
      			total_collection_count = coll_docs_count;
      			auditLogger.info("total collection document count for " + config.collection + " is " + coll_docs_count);
      			auditLogger.info("mongo query is " + config.mongo_query);

      			// In driver 3.x the projection belongs in the options object,
      			// not as a bare second argument (that is the legacy 2.x form).
      			var stream = coll.find(config.mongo_query, { projection: config.mongo_projection }).stream();

      			stream.once('end', async function () {
      				logger.info("Total inserted docs for collection " + config.collection + " before stream close is " + count);
      				if (count === coll_docs_count) {
      					logger.info("Total inserted docs in " + config.collection + " is " + count + ". Closing all DB connections! Huzzah!");
      				}
      			});

      			stream.on('error', function (err) {
      				auditLogger.error("action: 'log', param: 'mongodb streaming error'");
      				auditLogger.error(err);
      			});

      			stream.pipe(stage1).pipe(stage2).pipe(stage3).pipe(stage4).pipe(insertDataIntoDB);
      		});
      	});


      This is 100% reproducible: just create a stream on a cursor over a large collection.
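
      A common workaround for CursorNotFound on long-running reads (an assumption on my part, not something established in this ticket) is to avoid holding one server-side cursor open for the entire collection: either pass `noCursorTimeout: true` or a larger `batchSize` in the find options, or page through the collection with fresh `_id`-range queries so each cursor is short-lived. A sketch of the `_id`-pagination pattern follows, where the in-memory `collection` array and the `fetchBatch` helper are hypothetical stand-ins for real driver calls like `coll.find({ _id: { $gt: lastId } }).sort({ _id: 1 }).limit(n)`:

      ```javascript
      // Hedged sketch of the "_id pagination" workaround: instead of one cursor
      // held open for the whole collection (which the server may reap after its
      // idle timeout), issue a fresh short-lived query per batch, anchored on
      // the last _id seen. `collection` and `fetchBatch` are hypothetical
      // stand-ins for real driver calls.
      const collection = Array.from({ length: 25 }, (_, i) => ({ _id: i + 1 }));

      // Stand-in for: coll.find({ _id: { $gt: lastId } }).sort({ _id: 1 }).limit(n)
      function fetchBatch(lastId, n) {
        return collection.filter((d) => d._id > lastId).slice(0, n);
      }

      function processAll(batchSize, handle) {
        let lastId = 0;
        let total = 0;
        for (;;) {
          const batch = fetchBatch(lastId, batchSize); // fresh "cursor" each round
          if (batch.length === 0) break;
          batch.forEach(handle);
          total += batch.length;
          lastId = batch[batch.length - 1]._id; // resume point for the next query
        }
        return total;
      }

      const seen = [];
      const total = processAll(10, (doc) => seen.push(doc._id));
      // total === 25 (three batches: 10, 10, 5), seen is 1..25 in order
      ```

      This pattern assumes `_id` (or some other indexed field) is monotonically orderable, which is true for default ObjectId `_id` values.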


        People

        Assignee: Unassigned
        Reporter: satish.anupindi@aexp.com Satish Anupindi