[SERVER-6264] How to use mongoexport with a query? Created: 01/Jul/12 Updated: 15/Aug/12 Resolved: 01/Jul/12
| Status: | Closed |
| Project: | Core Server |
| Component/s: | Querying |
| Affects Version/s: | 2.0.6 |
| Fix Version/s: | None |
| Type: | Question | Priority: | Major - P3 |
| Reporter: | Melanie Galloway | Assignee: | Unassigned |
| Resolution: | Done | Votes: | 0 |
| Labels: | mongoexport |
| Remaining Estimate: | Not Specified |
| Time Spent: | Not Specified |
| Original Estimate: | Not Specified |
| Environment: | Linux OS |
| Participants: |
| Description |
I'm trying to output the results of a query as a CSV file using mongoexport. The query runs fine; here's a sample of the output:

> db.discussions.find({}, {subject:1,number_of_comments:1})
{ "_id" : ObjectId("4d094991516bcb9029000002"), "number_of_comments" : 1, "subject" : "APH10132614" }
{ "_id" : ObjectId("4d094992516bcb9029000004"), "number_of_comments" : 5, "subject" : "APH10112880" }
{ "_id" : ObjectId("4d094992516bcb9029000006"), "number_of_comments" : 1, "subject" : "APH10067253" }
{ "_id" : ObjectId("4d094992516bcb9029000008"), "number_of_comments" : 0, "subject" : "APH10042670" }
{ "_id" : ObjectId("4d094992516bcb902900000a"), "number_of_comments" : 25, "subject" : "APH10042299" }

I figured this would be an easy one to output as CSV since it's already in that sort of format, but this is what happens when I attempt it in mongoexport:

[911]galloway@host1 ~/mongodb-linux-x86_64-2.0.6/bin> ./mongoexport -d sellers-ph-production_190512 -c discussions -q '{subject:1,number_of_comments:1}' -f _id,subject,number_of_comments --csv > startable3.csv

My best guess is that I'm missing some sort of syntax when writing the query portion, but I haven't seen many examples of this, so I don't know what is wrong. Any advice is appreciated!
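For anyone landing on this ticket: the likely problem is that mongoexport's -q flag expects a query (filter) document, while {subject:1,number_of_comments:1} is a projection. Passed as a query, it asks for documents where subject and number_of_comments both equal 1, which matches nothing in the sample shown (subject is always a string). The fields to export are already selected by -f, so the projection-style -q can simply be dropped. A minimal sketch, reusing the database, collection, and field names from the ticket; the $gt filter in the second command is purely illustrative and not from the original report:

    # Export every document; the output columns are chosen by -f:
    ./mongoexport -d sellers-ph-production_190512 -c discussions \
        -f _id,subject,number_of_comments --csv -o startable3.csv

    # If an actual filter is wanted, -q takes a query document, e.g.
    # only discussions with at least one comment (illustrative filter):
    ./mongoexport -d sellers-ph-production_190512 -c discussions \
        -q '{"number_of_comments": {"$gt": 0}}' \
        -f _id,subject,number_of_comments --csv -o startable3.csv

With --csv and -f, the export should begin with a header row naming the selected fields (_id,subject,number_of_comments), followed by one line per matching document.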
| Comments |
| Comment by Scott Hernandez (Inactive) [ 01/Jul/12 ] |
Answered on mongodb-user, which is a much better place to ask. If you have any feature requests or find any bugs, please use Jira for that.