[SERVER-7355] mongoimport cannot import a dump in json array format larger than 16MB Created: 15/Oct/12 Updated: 11/Jul/16 Resolved: 27/Apr/13
| Status: | Closed |
| Project: | Core Server |
| Component/s: | Tools |
| Affects Version/s: | 2.2.0 |
| Fix Version/s: | 2.5.0 |
| Type: | Bug | Priority: | Major - P3 |
| Reporter: | edgar 88 | Assignee: | Shaun Verch |
| Resolution: | Done | Votes: | 0 |
| Labels: | mongoimport |
| Remaining Estimate: | Not Specified |
| Time Spent: | Not Specified |
| Original Estimate: | Not Specified |
| Issue Links: | |
| Operating System: | Windows |
| Participants: | |
| Description |

The following fails on 2.2.0. Seems to be related to https://jira.mongodb.org/browse/SERVER-6498?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=174809#comment-174809

    C:\>mongodb\bin\mongoimport.exe --collection collection --file file.json
| Comments |
| Comment by Ben Kiefer [ 03/Mar/14 ] |
|
Shaun, sorry it's taken so long to get back to you. Here is the information you requested. One thing to note is that we are using 2.4.6-rc1 for our export and do not have an easy way to upgrade that yet; I was using 2.6.0rc for the import. Here are the commands you requested. The problem line turned out to be 18339991 bytes long, yet it was successfully exported using the command above. It definitely looks like the issue is related to the one you linked above.
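The per-line size check Shaun asks for below can be scripted. A minimal sketch (not an official MongoDB tool; the file name "file.json" is a placeholder, and 16777216 bytes is the limit quoted in the error message):

```python
# Minimal sketch (assumption, not an official tool): report the longest line,
# in bytes, of a mongoexport output file, so it can be compared against
# mongoimport's 16777216-byte (16 MB) line buffer.
def max_line_bytes(path):
    longest = 0
    with open(path, "rb") as f:  # binary mode so we count bytes, not characters
        for line in f:
            longest = max(longest, len(line.rstrip(b"\r\n")))
    return longest

# Example: max_line_bytes("file.json") > 16777216 would mean mongoimport
# (run without --jsonArray) will fail with "input line too long".
```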
| Comment by Shaun Verch [ 25/Feb/14 ] |
|
Hi Ben,

Thanks for your report. There are actually two related issues. The original issue behind this ticket involved importing a collection that was exported using "- However, some of the internal buffers are still 16MB, which means if individual documents are too large, we can still see this error. See https://jira.mongodb.org/browse/SERVER-12884 for more details.

To confirm that this is actually your issue, could you provide the following:

1. The command line flags you used with mongoexport and mongoimport.
2. More details about the sizes of your documents. For example, if you did not use "--jsonArray", the size (in bytes) of each line of the file is the information that is relevant here.

Thanks,
| Comment by Ben Kiefer [ 25/Feb/14 ] |
|
I'm still getting this error on 2.6.0rc on Windows:

    exception:read error, or input line too long (max length: 16777216)
| Comment by mickdelaney [ 04/Dec/13 ] |
|
I'm getting this error on a JSON collection import to Mongo running 2.5.4 on Windows.

    call %MONGO_IMPORT% -h %SERVER% -d %DATABASE_NAME% -u %DATABASE_USER% -p %DATABASE_PWD% -c mycollection --drop --file mycollection.json
    connected to: localhost:27017
| Comment by auto [ 27/Apr/13 ] |
|
Author: Shaun Verch <shaun.verch@10gen.com>
Date: 2013-04-19T16:21:41Z
Message:
| Comment by auto [ 27/Apr/13 ] |
|
Author: Shaun Verch <shaun.verch@10gen.com>
Date: 2013-04-19T16:12:05Z
Message:
| Comment by auto [ 27/Apr/13 ] |
|
Author: Shaun Verch <shaun.verch@10gen.com>
Date: 2013-04-19T15:52:13Z
Message:
| Comment by auto [ 27/Apr/13 ] |
|
Author: Shaun Verch <shaun.verch@10gen.com>
Date: 2013-04-19T15:48:10Z
Message:
| Comment by auto [ 27/Apr/13 ] |
|
Author: Shaun Verch <shaun.verch@10gen.com>
Date: 2013-04-17T20:42:27Z
Message:
| Comment by edgar 88 [ 17/Oct/12 ] |
|
Hi Ben,

The individual JSON objects in the database do not exceed 16 MB; they are Twitter messages captured from the Twitter stream. I think the problem is that the objects are not line delimited (perhaps because I used the option --jsonarray). The first 10,000 characters of the file (all on the first line) are pasted here: http://pastebin.com/ehVZjcnD
| Comment by Ben Becker [ 17/Oct/12 ] |
|
Hi Edgar,

It sounds like mongoexport with --jsonArray produced a line longer than 16 MB. There are cases where the JSON representation of a document can be larger than the BSON, which we probably need to handle in mongoexport. That said, in general these tools are only used for importing/exporting data on MongoDB, while mongodump and mongorestore will preserve an exact copy of the data between the dump and restore process. This seems to be working as expected, but would it be possible to provide the long line that produces this error to confirm?

Thanks,
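For readers hitting this on affected versions, one workaround is to rewrite the array-format dump as line-delimited JSON and then import it without --jsonArray. A minimal sketch, assuming the whole dump fits in memory (file names are placeholders; Extended JSON values such as {"$oid": ...} round-trip as plain JSON):

```python
# Minimal sketch of a workaround (assumption, not an official tool): rewrite
# a mongoexport --jsonArray dump as line-delimited JSON so it can be fed to
# mongoimport *without* --jsonArray, avoiding the 16 MB array limit.
import json

def array_to_lines(src, dst):
    with open(src, "r", encoding="utf-8") as f:
        docs = json.load(f)                  # the dump is one big JSON array
    with open(dst, "w", encoding="utf-8") as f:
        for doc in docs:
            f.write(json.dumps(doc) + "\n")  # one document per line

# array_to_lines("file.json", "file.ndjson")
# then: mongoimport --collection collection --file file.ndjson
```

Note that a single huge document can still exceed the 16 MB line buffer (the separate issue tracked as SERVER-12884); this only helps when the array, not an individual document, is what crossed the limit.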
| Comment by edgar 88 [ 16/Oct/12 ] |
|
Importing with the option --jsonArray gives:

    Tue Oct 16 09:17:19 exception:JSONArray file too large
| Comment by edgar 88 [ 16/Oct/12 ] |
|
When I try to read the file created by mongoexport via a BufferedReader in Java, I get an out-of-memory error. It seems to me that the JSON file is not line delimited, which is why the "input line too long" error is thrown by mongoimport.
| Comment by edgar 88 [ 15/Oct/12 ] |
|
The file file.json was created just before the import, using: mongoexport --collection collection --file file.json --jsonarray