[CSHARP-143] InsertBatch exception when the batch of objects is too large Created: 28/Dec/10 Updated: 02/Apr/15 Resolved: 28/Dec/10 |
|
| Status: | Closed |
| Project: | C# Driver |
| Component/s: | None |
| Affects Version/s: | 0.9 |
| Fix Version/s: | 1.0 |
| Type: | Bug | Priority: | Major - P3 |
| Reporter: | xuqing | Assignee: | Robert Stam |
| Resolution: | Done | Votes: | 0 |
| Labels: | None | ||
| Remaining Estimate: | Not Specified | ||
| Time Spent: | Not Specified | ||
| Original Estimate: | Not Specified | ||
| Environment: | VS2010 | ||
| Description |
|
I have a List<> object in C# with about 100,000 items. When I use the InsertBatch<> method to insert these items into MongoDB, I get an exception and the data cannot be inserted into the database. |
| Comments |
| Comment by Testo [ 28/Dec/10 ] |
|
The limit is 16MB in 1.6 and 48MB in 1.7, but it applies to the entire message length (including the wire protocol MsgHeader), not just the data; see the header sketch below. |
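For reference, a minimal C# sketch of the message header the comment refers to, based on the publicly documented MongoDB wire protocol (four little-endian int32 fields, 16 bytes in total); this is illustrative, not driver source:

// Sketch of the MongoDB wire protocol message header (illustrative only).
struct MsgHeader
{
    public int MessageLength; // total message size in bytes, including this 16-byte header
    public int RequestId;     // client-generated identifier for this message
    public int ResponseTo;    // RequestId being replied to (used in responses only)
    public int OpCode;        // message type, e.g. OP_INSERT = 2002
}

Because MessageLength counts the header and the rest of the envelope, the usable space for the inserted documents is slightly less than the advertised limit. |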
| Comment by Robert Stam [ 28/Dec/10 ] |
|
First, there was a typo in the definition of MongoDefaults.MaxMessageLength, so the value was higher than 16MB. Second, the server doesn't seem to like 16MB either, so I reduced the maximum message length to 12MB (at least for now). |
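For anyone hitting this before the fix ships, here is a minimal sketch of how a caller could split a large batch on the client side. The BatchSplitter helper below is hypothetical (not part of the driver), and it assumes the MongoDB.Bson ToBson() extension method is available to measure serialized document size:

using System.Collections.Generic;
using MongoDB.Bson;

static class BatchSplitter
{
    // Hypothetical helper: yields sub-batches whose combined serialized size
    // stays under maxBatchBytes (e.g. a value safely below the 12MB message
    // limit mentioned above, leaving headroom for message overhead).
    public static IEnumerable<List<BsonDocument>> Split(
        IEnumerable<BsonDocument> documents, int maxBatchBytes)
    {
        var batch = new List<BsonDocument>();
        var batchBytes = 0;
        foreach (var document in documents)
        {
            var documentBytes = document.ToBson().Length; // serialized BSON size
            if (batch.Count > 0 && batchBytes + documentBytes > maxBatchBytes)
            {
                yield return batch;
                batch = new List<BsonDocument>();
                batchBytes = 0;
            }
            batch.Add(document);
            batchBytes += documentBytes;
        }
        if (batch.Count > 0)
        {
            yield return batch;
        }
    }
}

Each sub-batch returned by Split could then be passed to InsertBatch separately, keeping every message under the limit. |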
| Comment by Robert Stam [ 28/Dec/10 ] |
|
Excellent information. Thanks. I will try to reproduce. |
| Comment by xuqing [ 28/Dec/10 ] |
|
StackTrace:
   at System.Net.Sockets.NetworkStream.Write(Byte[] buffer, Int32 offset, Int32 size)
   at MongoDB.Bson.IO.BsonBuffer.WriteTo(Stream stream) |
| Comment by xuqing [ 28/Dec/10 ] |
|
And the output of the program is below: [11:56:05.096] Init |
| Comment by xuqing [ 28/Dec/10 ] |
|
My program is below:

using System;
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Driver;

namespace MongoTest
{
    class Program
    {
        static void Main(string[] args)
        {
            MongoServer _dbServer = MongoServer.Create();
            try
            {
                if ((_dbServer.State == MongoServerState.Disconnected) || (_dbServer.State == MongoServerState.None))
                {
                    System.Console.WriteLine(string.Format("[{0}] Connect To Mongo", DateTime.Now.ToString("HH:mm:ss.fff")));
                    _dbServer.Connect();
                    System.Console.WriteLine(string.Format("[{0}] Get Collection tTest", DateTime.Now.ToString("HH:mm:ss.fff")));
                    MongoDatabase _db = _dbServer.GetDatabase("test");
                    if (_db.CollectionExists("tTest"))
                    {
                        _db.DropCollection("tTest");
                    }
                    MongoCollection<BsonDocument> _document = _db.GetCollection("tTest");
                    List<BsonDocument> _insertDoc = new List<BsonDocument>();
                    int _iRowsCount = 100000;
                    System.Console.WriteLine(string.Format("[{0}] Create {1} Rows of Data", DateTime.Now.ToString("HH:mm:ss.fff"), _iRowsCount));
                    TimeSpan _ts = new TimeSpan(0);
                    for (int i = 0; i < _iRowsCount; i++)
                    {
                        List<BsonElement> _eleList = new List<BsonElement>();
                        _eleList.Add(new BsonElement("dtInsert", (DateTime.Now.ToUniversalTime().Ticks - 621355968000000000) / 10000000));
                        _eleList.Add(new BsonElement("iGameId", i));
                        _eleList.Add(new BsonElement("sProcGuid", System.Guid.NewGuid().ToString()));
                        _eleList.Add(new BsonElement("dtSniff", (DateTime.Now.ToUniversalTime().Ticks - 621355968000000000) / 10000000));
                        _eleList.Add(new BsonElement("sCardNo", ""));
                        _eleList.Add(new BsonElement("sSex", "Unknown"));
                        _eleList.Add(new BsonElement("iAge", 0));
                        _eleList.Add(new BsonElement("iState", 1));
                        _insertDoc.Add(new BsonDocument(_eleList));
                    }
                    System.Console.WriteLine(string.Format("[{0}] Data Ready, Begin Write of {1} Rows", DateTime.Now.ToString("HH:mm:ss.fff"), _iRowsCount));
                    DateTime _dtStart = DateTime.Now;
                    _document.InsertBatch(_insertDoc, SafeMode.True);
                    DateTime _dtEnd = DateTime.Now;
                    _ts = _dtEnd.Subtract(_dtStart);
                    System.Console.WriteLine(string.Format("[{0}] {1} Rows Written Successfully", DateTime.Now.ToString("HH:mm:ss.fff"), _iRowsCount));
                    System.Console.WriteLine(string.Format("Done. Took {0}h {1}m {2}s {3}ms", _ts.Hours, _ts.Minutes, _ts.Seconds, _ts.Milliseconds));
                }
            }
            catch (Exception ex)
            {
                System.Console.WriteLine(string.Format("Failed. Exception: {0}", ex.Message));
            }
        }
    }
} |
| Comment by Robert Stam [ 28/Dec/10 ] |
|
Can you provide an error message and a stack trace? And perhaps a description of the document you are trying to save. Do you mean you have a list of 100000 documents, or a document that has an internal list of 100000 values? There is a maximum size for a single document; you may just be reaching this limit. The limit is currently 4MB but will be increasing. |
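For what it's worth, here is a minimal sketch of how to check whether any single document is hitting the per-document limit rather than the message limit. The DocumentSizeCheck helper is hypothetical, and it assumes the MongoDB.Bson ToBson() extension method is available to measure serialized size:

using System.Collections.Generic;
using MongoDB.Bson;

static class DocumentSizeCheck
{
    // Assumed per-document limit from the comment above (4MB at the time);
    // the actual server limit varies by version.
    private const int MaxDocumentSize = 4 * 1024 * 1024;

    // Yields the documents whose serialized BSON size exceeds the limit.
    public static IEnumerable<BsonDocument> FindOversized(IEnumerable<BsonDocument> documents)
    {
        foreach (var document in documents)
        {
            if (document.ToBson().Length > MaxDocumentSize)
            {
                yield return document;
            }
        }
    }
}

If FindOversized returns nothing, the failure is more likely the overall batch message size than an individual document. |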