[SERVER-7938] v8 running out of memory when compiling large script Created: 14/Dec/12 Updated: 14/Apr/16 Resolved: 15/Jan/16

| Status: | Closed |
| Project: | Core Server |
| Component/s: | JavaScript |
| Affects Version/s: | 2.3.2 |
| Fix Version/s: | None |
| Type: | Bug |
| Priority: | Major - P3 |
| Reporter: | hari.khalsa@10gen.com |
| Assignee: | DO NOT USE - Backlog - Platform Team |
| Resolution: | Done |
| Votes: | 0 |
| Labels: | None |
| Remaining Estimate: | Not Specified |
| Time Spent: | Not Specified |
| Original Estimate: | Not Specified |
| Attachments: | |
| Backwards Compatibility: | Fully Compatible |
| Operating System: | ALL |
| Participants: | |

Description
I have a 27MB file that basically has variations on this:

If I run this normally, I get this:

If I compile with --usesm, it works/doesn't crash.
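The script excerpt and the error output referenced above did not survive this export. Based on the later comments (every insert binds a new, globally rooted variable), a plausible shape for the failing script is sketched below; the variable names and document contents are invented for illustration and are not from the original attachment:

```javascript
// Hypothetical reconstruction of the failing pattern: each line binds a
// fresh global variable, so every document stays rooted and the GC can
// never reclaim anything while the 27MB script is compiled and run.
var doc0 = { _id: 0, text: "some searchable text" };
var doc1 = { _id: 1, text: "some searchable text" };
var doc2 = { _id: 2, text: "some searchable text" };
// ... and so on for hundreds of thousands of lines ...
```

With every binding rooted, V8's heap grows monotonically until it hits its old-space limit.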
Comments

Comment by J Rassi [ 22/Jan/13 ]

Looks like this can also be reproduced by running a command that returns a large document. Will probably commonly show up for users testing text search. See
Comment by Ben Becker [ 26/Dec/12 ]

We may be able to adjust the max_old_space_size according to http://code.google.com/p/v8/issues/detail?id=847, but I'm not confident that will fix this issue (as this happens during compilation). It seems like V8 could abide by IgnoreOutOfMemoryException() for this case, but that may require a change to the v8 source (and we'd need to verify whether the isolate can be reused in this case).
Comment by hari.khalsa@10gen.com [ 14/Dec/12 ]

Well, not just twice as much memory... at least twice, since it crashes before it even finishes. I do think it's reasonable to have memory restrictions and shut down cleanly and so on, but maybe it can be made clearer to the user in some way?
Comment by Ben Becker [ 14/Dec/12 ]

Yeah, twice the memory isn't awesome, but the only thing I can think of is to cleanly shut down v8 on OOM. This could also be related to additional memory leaks in v8 that haven't been identified/fixed yet. There appears to be one related to lots of arrays, but I haven't attempted to reproduce it since fixing the main GC issues. I'll dig further.
Comment by Tad Marshall [ 14/Dec/12 ]

I think it's a regression if something worked with SpiderMonkey and doesn't work with V8, unless we conclude that it should not have worked with SpiderMonkey for some reason (and so working with SpiderMonkey was actually a "bug"). But we may not have the ability to make V8 consume the same amount of RAM as SpiderMonkey... it's a different engine.
Comment by hari.khalsa@10gen.com [ 14/Dec/12 ]

I don't actually need the script I'm running to have so many global refs; I just didn't expect it to crash. I'm concerned that V8 uses so much more memory than SM. I can keep this many global refs with SM but not with V8... does this matter or not?
Comment by Ben Becker [ 14/Dec/12 ]

One thing to note: the 64MB limit is only for reclaimable space, not active memory. I believe we can limit either, though.
Comment by Ben Becker [ 14/Dec/12 ]

Tad, I'm not sure the 27MB is as relevant as the fact that all variables are globally rooted. This would have to account for v8's internal memory, not just the JS syntax. Raising the limit sounds good, but I don't want v8 to consume too much physical memory, or it could compete heavily with mongod (possibly degrading normal db performance). That said, I'm still using the 64MB value that was set before we properly accounted for BSON objects. We could implement a simple resource manager (e.g. to account for the number of concurrent isolates, available physical memory, etc.). I also think we shouldn't die in this case; I think we should just make the script fail.

hari.khalsa@10gen.com, thanks. IIRC 1G is actually a hard limit in v8 (by design); not sure we can do anything about this. Can you try modifying the script so each variable is overwritten with undefined or 'delete'd, or simply do this in a for loop?
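The modification Ben suggests can be sketched as follows. This is illustrative only, not the original script: the loop bound, document shape, and use of globalThis are assumptions (the mongo shell of that era creates globals by bare assignment). Reusing one loop variable, or explicitly releasing each global, leaves the handles reclaimable by the GC:

```javascript
// Variant 1: reuse a single variable so at most one document is rooted
// at any time; the previous document becomes garbage on each iteration.
var doc;
for (var i = 0; i < 100000; i++) {
  doc = { _id: i, text: "some searchable text" };
  // db.test.insert(doc);  // the insert itself, omitted here
}
doc = undefined; // release the last reference

// Variant 2: create the binding as a deletable global property and
// remove it after use ('var' bindings themselves cannot be deleted).
globalThis.tmp = { _id: 0, text: "some searchable text" };
delete globalThis.tmp;
```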
Comment by hari.khalsa@10gen.com [ 14/Dec/12 ]

The SM build used about 400MB according to 'top' on my computer. The V8 build got up to about 1G+ and then died.
Comment by Tad Marshall [ 14/Dec/12 ]

If we're hitting the OOM at 27 MB instead of closer to 64 MB, is it possible that we're double-allocating or double-accounting? Also, I suspect that 64 MB may be somewhat low for the kinds of things that people might want to do with JavaScript. I wonder if we should raise the limit, or perhaps add a setParameter or something to allow for larger workspaces. The shell in particular could probably use a much higher limit.
Comment by Ben Becker [ 14/Dec/12 ]

This actually raises another interesting point: I think we can cleanly recover from this state in v8 now. This would just cause the script to fail instead of the whole process.
Comment by Ben Becker [ 14/Dec/12 ]

Hi Hari, this script creates a unique handle for every insert and doesn't appear to release that handle (using 'delete' or simply overwriting the variable with null/undefined), so the GC can't run. I don't think we ever configured SM's GC to obey a limit. One question: how much memory did the SpiderMonkey mongo client use around the end of the test? We can easily raise this limit in v8 if we want to support scripts like this.
Comment by Tad Marshall [ 14/Dec/12 ]

The message
should be fixed by