[SERVER-8047] Increase the limit for JsonArray that can be imported with mongoimport Created: 28/Dec/12 Updated: 01/Apr/13 Resolved: 31/Dec/12 |
|
| Status: | Closed |
| Project: | Core Server |
| Component/s: | Tools |
| Affects Version/s: | 2.2.0 |
| Fix Version/s: | None |
| Type: | Improvement | Priority: | Major - P3 |
| Reporter: | pratik dalal | Assignee: | Unassigned |
| Resolution: | Duplicate | Votes: | 0 |
| Labels: | None |
| Remaining Estimate: | Not Specified |
| Time Spent: | Not Specified |
| Original Estimate: | Not Specified |
| Issue Links: |
|
| Participants: |
| Description |
|
Creating this JIRA as a follow-up. I tried to use mongoimport to import a JSON-array file, but the file is larger than 16 MB (about ~200 MB), so the import fails: `mongoimport -d mydatabase -c mycollection --jsonArray myfile`. Is there already a workaround for importing such huge JSON arrays so that the imported data can also be queried later, as opposed to simply storing huge chunks of data via GridFS? Thank you. |
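One common workaround, not stated in this ticket but widely used, is to rewrite the top-level JSON array as newline-delimited JSON: without `--jsonArray`, mongoimport reads one document per line, so only each individual document must fit within the 16 MB BSON limit, not the whole file. Below is a minimal sketch in Python; the file names and the helper `array_to_ndjson` are hypothetical, and it assumes the input is a single top-level JSON array that fits in memory (reasonable for ~200 MB).

```python
import json

# Hypothetical file names for illustration only.
IN_PATH = "myfile"          # original JSON-array file
OUT_PATH = "myfile.ndjson"  # newline-delimited output for mongoimport

def array_to_ndjson(in_path, out_path):
    """Rewrite a top-level JSON array as one JSON document per line.

    mongoimport (without --jsonArray) streams documents line by line,
    so each document only needs to respect the 16 MB BSON limit
    individually, rather than the whole array at once.
    """
    with open(in_path) as src:
        docs = json.load(src)  # loads the whole array into memory
    with open(out_path, "w") as dst:
        for doc in docs:
            dst.write(json.dumps(doc))
            dst.write("\n")
    return len(docs)
```

The converted file could then be imported with something like `mongoimport -d mydatabase -c mycollection myfile.ndjson` (note: no `--jsonArray` flag). For files too large to load into memory, a streaming JSON parser would be needed instead of `json.load`.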
| Comments |
| Comment by Eliot Horowitz (Inactive) [ 31/Dec/12 ] |
|
An array that size is not supported. |
| Comment by pratik dalal [ 31/Dec/12 ] |
|
Yes, the single file (a JSON array) is > 200 MB. How do I import the data from this single file into MongoDB for querying later? Thank you. |
| Comment by Eliot Horowitz (Inactive) [ 30/Dec/12 ] |
|
Do you mean 1 document more than 200mb? |