Critical - P2
RAM: 4GB, 8GB & 32GB
mongodump version: r3.2.11
git version: 45418a84270bd822db0d6d0c37a0264efb0e86d2
Go version: go1.7
MongoDB 3.2.10 replica set, with authentication and SSL enabled.
Data Size: ±8GB
Platforms 2016-11-21, Platforms 2017-01-23
A mongodump with compression to an output directory (not an archive) rapidly consumes all available RAM and is terminated by the OOM killer. For example, with an 8GB data set, mongodump reached roughly 24GB RSS before being killed. A few times there was a stack trace, which I've attached; mostly it's the OOM killer.
- Without the gzip option, the dump completes successfully.
- `mongodump --gzip --archive=/var/backups/mongodb/test/` completes successfully.
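For comparison, the three behaviours can be sketched as hypothetical invocations (connection, authentication, and SSL options omitted; the paths and archive filename are assumptions, not the exact arguments from this report):

```shell
# Fails: per-collection gzip output to a dump directory exhausts RAM (OOM-killed)
mongodump --gzip --out=/var/backups/mongodb/test/

# Succeeds: same dump directory, no compression
mongodump --out=/var/backups/mongodb/test/

# Succeeds: gzip-compressed single archive file instead of a directory
mongodump --gzip --archive=/var/backups/mongodb/test.archive
```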
The exact command line arguments being supplied: