[SERVER-35106] cursor iteration freezes after the first 101 docs (1st batch + 1) Created: 21/May/18 Updated: 18/May/20 Resolved: 18/Jun/18 |
|
| Status: | Closed |
| Project: | Core Server |
| Component/s: | Querying |
| Affects Version/s: | None |
| Fix Version/s: | None |
| Type: | Bug | Priority: | Major - P3 |
| Reporter: | bo yuan | Assignee: | Nick Brewer |
| Resolution: | Cannot Reproduce | Votes: | 0 |
| Labels: | None | ||
| Remaining Estimate: | Not Specified | ||
| Time Spent: | Not Specified | ||
| Original Estimate: | Not Specified | ||
| Issue Links: |
|
| Operating System: | ALL |
| Steps To Reproduce: |
|
| Participants: |
| Description |
|
I got a cursor from the simplest query possible (db.some_collection.find()), and if the collection is big, iteration on the cursor pauses after the first 101 docs have been iterated (the 1st batch plus 1 more doc). The mongod process is doing a lot of I/O reading; it looks like it is loading the whole data set. That makes my code freeze for a long time on large collections. The same thing happens whether I run the query via pymongo or in the mongo shell. |
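A minimal shell sketch of the kind of loop being described (the collection name some_collection is illustrative; a plain find() returns 101 documents in its first batch, so the pause described above would coincide with the first getMore):

```javascript
// Sketch only: "some_collection" stands in for the reporter's large collection.
var cursor = db.some_collection.find();   // no filter, default batch sizes
var count = 0;
var start = new Date();
while (cursor.hasNext()) {
    cursor.next();
    count++;
    // The first batch contains 101 documents; the first getMore is issued when it
    // is exhausted, which is where the pause is reported to occur.
    if (count === 101 || count % 1000 === 0) {
        print("fetched " + count + " docs after " + (new Date() - start) + " ms");
    }
}
```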
| Comments |
| Comment by Nick Brewer [ 18/Jun/18 ] |
|
Hi, the numYields field does not represent how many results are being returned from a database query; instead it represents the number of times the operation yielded in order to allow other operations to complete. This is explained in the database profiler documentation and in the concurrency documentation. From your responses, I do not see anything to indicate a bug in the MongoDB server. For MongoDB-related support discussion, please post on the mongodb-user group or Stack Overflow with the mongodb tag. Nick |
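For context, numYields can be watched from a second shell while a query runs; a sketch, assuming the hypothetical collection name some_collection:

```javascript
// numYields counts how many times the operation yielded its locks so other
// operations could proceed; it says nothing about how many documents were returned.
db.currentOp({ "command.find": "some_collection" }).inprog.forEach(function (op) {
    print("opid=" + op.opid +
          " numYields=" + op.numYields +
          " secs_running=" + op.secs_running);
});
```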
| Comment by bo yuan [ 22/May/18 ] |
|
I will create an index on the field used in the filter, but right now the numYields value looks weird. |
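The index mentioned would be created along these lines (a sketch only; the field name status stands in for whatever field the filter actually uses):

```javascript
// Hypothetical: index the filtered field so the query can seek to matching
// documents instead of scanning the whole collection.
db.some_collection.createIndex({ status: 1 });
```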
| Comment by bo yuan [ 22/May/18 ] |
|
Mr Fernandez, I'm using v3.6.0. Actually, I cannot reproduce that situation today. Now, with a simple find(), the iteration still freezes after doc 101, but it can continue after a reasonable time. However, here is another issue. I'm now doing a find() with a filter that quite a lot of docs at the beginning of the collection do NOT satisfy, so understandably the iteration waits a long time before the first doc is yielded. BUT if I check the db with db.currentOp(), the mongo shell shows an ever-increasing numYields number for that op, while the cursor is not yielding any documents at all. That is quite misleading, so why is that?
Thanks a lot! |
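The behaviour described is what an unindexed filter looks like from the server's side; a sketch using explain(), assuming a hypothetical status field that few early documents match:

```javascript
// Without an index the server collection-scans, periodically yielding its locks
// (hence the growing numYields) while examining documents that do not match,
// so no results reach the cursor until the first match is found.
var stats = db.some_collection.find({ status: "rare_value" })
                              .explain("executionStats")
                              .executionStats;
print("docs examined: " + stats.totalDocsExamined +
      ", docs returned: " + stats.nReturned +
      ", time: " + stats.executionTimeMillis + " ms");
```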
| Comment by Ramon Fernandez Marina [ 21/May/18 ] |
|
rambo_yuanbo@outlook.com, what version of MongoDB are you using? Have you checked the logs for information, and the output of db.currentOp() when this issue happens? Also, if you have a reproducer that shows the behavior you describe, it would be of great help if you could share it with us. Regards, |
| Comment by bo yuan [ 21/May/18 ] |
|
Sorry, I cannot find how to edit the issue. After browsing other issues, I found I can work around the problem like this: db.collection.find().batchSize(100). BUT it is still weird that the cursor freezes and apparently tries to load the whole data set if I don't explicitly ask for a batchSize. |
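For reference, the workaround described looks like this in the shell (a sketch; some_collection is illustrative):

```javascript
// Request smaller batches so each getMore returns after at most 100 documents
// rather than waiting for the server to assemble a larger batch.
var cursor = db.some_collection.find().batchSize(100);
while (cursor.hasNext()) {
    printjson(cursor.next());
}
```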