[CSHARP-1510] GetMore throws an InvalidOperationException under certain conditions Created: 15/Dec/15 Updated: 02/Apr/16 Resolved: 07/Jan/16 |
|
| Status: | Closed |
| Project: | C# Driver |
| Component/s: | Operations |
| Affects Version/s: | 2.2 |
| Fix Version/s: | 2.2.1 |
| Type: | Bug | Priority: | Major - P3 |
| Reporter: | Bret Ferrier | Assignee: | Robert Stam |
| Resolution: | Done | Votes: | 0 |
| Labels: | regression |
| Remaining Estimate: | Not Specified |
| Time Spent: | Not Specified |
| Original Estimate: | Not Specified |
| Environment: | Windows (Azure website and Win 10) .Net 4.6 |
| Description |
|
I have a query that works against the 2.0 driver. I upgraded the driver to 2.2 without changing the query, and in some situations I now get the error below. I had to revert production to the 2.0 driver to make it go away. Below is the error:
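The original query and the full error text were not included in the export. As a hedged illustration only, the pattern discussed in the comments below (a Find with a filter, a sort, and a large limit, materialized with ToListAsync and no explicit BatchSize) would look roughly like the sketch here; the Item type, field names, and values are hypothetical:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;

// Hypothetical document type standing in for the reporter's actual collection.
public class Item
{
    public ObjectId Id { get; set; }
    public string Category { get; set; }
    public DateTime CreatedAt { get; set; }
}

public static class ReproSketch
{
    // Worked on driver 2.0; on 2.2 a large limit with the default (null) BatchSize
    // can throw InvalidOperationException during GetMore, as discussed in the comments.
    public static Task<List<Item>> LoadAsync(IMongoCollection<Item> collection)
    {
        var filter = Builders<Item>.Filter.Eq(x => x.Category, "widgets");
        var sort = Builders<Item>.Sort.Descending(x => x.CreatedAt);

        return collection.Find(filter)
                         .Sort(sort)
                         .Limit(10000)
                         .ToListAsync();
    }
}
```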
|
| Comments |
| Comment by Githook User [ 07/Jan/16 ] |
|
Author: rstam <robert@robertstam.org>
Message: |
| Comment by Githook User [ 15/Dec/15 ] |
|
Author: rstam <robert@robertstam.org>
Message: |
| Comment by Robert Stam [ 15/Dec/15 ] |
|
Good point. We'll review the documentation. BatchSize is in units of "documents". It is the maximum number of documents the server could return in a single batch. Note: the server takes this as a hint and could return fewer documents. |
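To make the units concrete, here is a minimal sketch (reusing the hypothetical Item type from the description above and assuming an IMongoCollection<Item> in scope): BatchSize is a document count per batch, not a byte size, and the server may return fewer documents than requested.

```csharp
// Requires: using System; using System.Linq; using System.Threading.Tasks; using MongoDB.Driver;
public static async Task CountDocumentsPerBatchAsync(IMongoCollection<Item> collection)
{
    // BatchSize = 1000 asks the server for up to 1000 documents per batch (not 1000 bytes or KB).
    var options = new FindOptions<Item> { BatchSize = 1000 };

    using (var cursor = await collection.FindAsync(Builders<Item>.Filter.Empty, options))
    {
        while (await cursor.MoveNextAsync())
        {
            // cursor.Current holds one batch; the server treats BatchSize as a hint
            // and may return fewer documents than requested.
            Console.WriteLine($"batch contained {cursor.Current.Count()} documents");
        }
    }
}
```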
| Comment by Bret Ferrier [ 15/Dec/15 ] |
|
So, looking at FindOptionsBase, there is no documentation about what the units of BatchSize are. For myIfindFluent.Options.BatchSize = 4; is that 4 KB or just 4 bytes? I am assuming that setting it to 4 MB is the best option until the fix is pushed out. Also, it seems that BatchSize is used in too many places, as the documentation below uses it differently when talking about a cursor: it refers to how many documents are returned, not the size of the batch. |
| Comment by Robert Stam [ 15/Dec/15 ] |
|
The exact value of limit that will trigger this bug depends on the document sizes. The server returns batches of about 4MB in size, so any limit that includes enough documents to exceed the 4MB batch size will trigger this bug. You can work around this by setting BatchSize to any non-null value. Avoid very small batch sizes, as that would result in excessive round trips to the server. |
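A minimal sketch of that workaround, using the hypothetical Item type from above and setting Options.BatchSize the same way the other comment in this thread does:

```csharp
// Requires: using System.Collections.Generic; using System.Threading.Tasks; using MongoDB.Driver;
public static async Task<List<Item>> LoadWithWorkaroundAsync(IMongoCollection<Item> collection)
{
    var find = collection.Find(Builders<Item>.Filter.Empty)
                         .Sort(Builders<Item>.Sort.Descending(x => x.CreatedAt))
                         .Limit(10000);

    // Any non-null BatchSize avoids the GetMore bug in driver 2.2; keep it reasonably
    // large (hundreds of documents) so the driver doesn't make excessive round trips.
    find.Options.BatchSize = 1000;

    return await find.ToListAsync();
}
```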
| Comment by Bret Ferrier [ 15/Dec/15 ] |
|
I just did a test and found that I am unable to get a batch size of 600 but can get a batch size of 500, so the magic number is somewhere in between. I'm not sure if it is constant or depends on the query. |
| Comment by Bret Ferrier [ 15/Dec/15 ] |
|
Robert, what kind of batch size is "too large"? It is currently skipping 0, setting a batch size of 750, and connecting to a 3.0 instance of Mongo. |
| Comment by Bret Ferrier [ 15/Dec/15 ] |
|
If I change the code to use ToList as opposed to ToListAsync, I get the following error.
If I render out the Filter and the Sort, below is what I get.
Sort
|
| Comment by Robert Stam [ 15/Dec/15 ] |
|
Looks like the most likely way to encounter this is to use a very large limit with a default batch size of null. The limit has to be large enough that the result set doesn't fit in the initial reply from the server. Also, this should only occur on server versions < 3.2. A workaround for now is to set a value for the batch size. |
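For completeness, a hedged sketch of the triggering condition itself (hypothetical Item type again): a limit that covers more data than fits in the server's initial reply, combined with the default null batch size.

```csharp
// Requires: using System.Collections.Generic; using System.Threading.Tasks; using MongoDB.Driver;
public static Task<List<Item>> TriggersTheBugAsync(IMongoCollection<Item> collection)
{
    // No BatchSize is set (it defaults to null) and the limit is large enough that the
    // result set does not fit in the first reply, so the driver issues a GetMore.
    // Against servers older than 3.2, driver 2.2 then throws InvalidOperationException;
    // setting any non-null BatchSize (as in the workaround sketch above) avoids it.
    return collection.Find(Builders<Item>.Filter.Empty)
                     .Limit(100000)
                     .ToListAsync();
}
```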
| Comment by Robert Stam [ 15/Dec/15 ] |
|
Thanks for reporting this. I will attempt to reproduce it. If you can provide some code that reproduces it, that would be very helpful. |