Mon Dec 8 06:34:08 [Balancer] caught exception while doing balance: distributed lock balancer/atc005:37017:1415292838:1804289383 had errors communicating with individual server mongo-livelog-b-2:27019 :: caused by :: DBClientBase::findN: transport error: mongo-livelog-a-2:27019 ns: config.locks query: { _id: "balancer" }
Mon Dec 8 06:34:08 [Balancer] warning: distributed lock 'balancer/atc005:37017:1415292838:1804289383 did not propagate properly. :: caused by :: 8017 update not consistent ns: config.locks query: { _id: "balancer", state: 0, ts: ObjectId('5485b6dcc16e738217f43163') } update: { $set: { state: 1, who: "atc005:37017:1415292838:1804289383:Balancer:1169620192", process: "atc005:37017:1415292838:1804289383", when: new Date(1418049248769), why: "doing balance round", ts: ObjectId('5485b6e0903bf7e88aaa2276') } } gle1: { updatedExisting: true, n: 1, connectionId: 10, waited: 33, err: null, ok: 1.0 } gle2: { updatedExisting: false, n: 0, connectionId: 1601504, waited: 10, err: null, ok: 1.0 }
Mon Dec 8 06:34:08 [Balancer] distributed lock 'balancer/handprocessor003:37017:1417702041:1804289383' was not acquired.
Mon Dec 8 06:34:08 [Balancer] could not acquire lock 'balancer/handprocessor003:37017:1417702041:1804289383' (another update won)
Mon Dec 8 06:34:04 [Balancer] distributed lock 'balancer/celery009:37017:1411743445:1804289383' unlocked.
Mon Dec 8 06:34:04 [Balancer] distributed lock 'balancer/celery009:37017:1411743445:1804289383' acquired, ts : 5485b6dcc16e738217f43163
Mon Dec 8 06:33:46 [Balancer] distributed lock 'balancer/persist031:37017:1415803447:1804289383' unlocked.
Mon Dec 8 06:33:46 [Balancer] distributed lock 'balancer/persist031:37017:1415803447:1804289383' acquired, ts : 5485b6ca2373cfc96bd244bc
Mon Dec 8 06:33:46 [LockPinger] handled late remove of old distributed lock with ts 5485b6b612aeba6251704c6c
Mon Dec 8 06:33:46 [LockPinger] trying to delete 1 old lock entries for process persist027:37017:1401989364:1804289383
Mon Dec 8 06:33:45 [Balancer] distributed lock 'balancer/handprocessor001:37017:1417702049:1804289383' unlocked.
Mon Dec 8 06:33:45 [Balancer] distributed lock 'balancer/handprocessor001:37017:1417702049:1804289383' acquired, ts : 5485b6c9a0bc80f775bfebfb
Mon Dec 8 06:33:45 [Balancer] distributed lock 'balancer/persist030:37017:1415803442:1804289383' unlocked.
Mon Dec 8 06:33:45 [Balancer] distributed lock 'balancer/persist030:37017:1415803442:1804289383' acquired, ts : 5485b6c93b6945bd1536a875
Mon Dec 8 06:33:45 [Balancer] distributed lock 'balancer/celery007:37017:1411826479:1804289383' unlocked.
Mon Dec 8 06:33:45 [Balancer] distributed lock 'balancer/celery007:37017:1411826479:1804289383' acquired, ts : 5485b6c9782ba83693223c15
Mon Dec 8 06:33:44 [Balancer] distributed lock 'balancer/persist025:37017:1411736275:1804289383' was not acquired.
Mon Dec 8 06:33:44 [Balancer] could not acquire lock 'balancer/persist025:37017:1411736275:1804289383' (another update won)
Mon Dec 8 06:33:44 [Balancer] distributed lock 'balancer/persist027:37017:1401989364:1804289383' unlocked.
Mon Dec 8 06:33:44 [Balancer] distributed lock 'balancer/persist027:37017:1401989364:1804289383' acquired, ts : 5485b6c812aeba6251704c6e
Mon Dec 8 06:33:44 [Balancer] distributed lock 'balancer/persist032:37017:1415803466:1804289383' was not acquired.
Mon Dec 8 06:33:44 [Balancer] could not acquire lock 'balancer/persist032:37017:1415803466:1804289383' (another update won)
Mon Dec 8 06:33:44 [Balancer] distributed lock 'balancer/persist029:37017:1415803043:1804289383' was not acquired.
Mon Dec 8 06:33:44 [Balancer] could not acquire lock 'balancer/persist029:37017:1415803043:1804289383' (another update won)
Mon Dec 8 06:33:44 [Balancer] distributed lock 'balancer/persist021:37017:1417712282:1804289383' was not acquired.
Mon Dec 8 06:33:44 [Balancer] could not acquire lock 'balancer/persist021:37017:1417712282:1804289383' (another update won)
Mon Dec 8 06:33:44 [Balancer] distributed lock 'balancer/admin:37017:1404422906:1804289383' was not acquired.
Mon Dec 8 06:33:44 [Balancer] could not acquire lock 'balancer/admin:37017:1404422906:1804289383' (another update won)
Mon Dec 8 06:33:44 [Balancer] distributed lock 'balancer/atc005:37017:1415292838:1804289383' was not acquired.
Mon Dec 8 06:33:44 [Balancer] could not acquire lock 'balancer/atc005:37017:1415292838:1804289383' (another update won)
Mon Dec 8 06:33:40 [Balancer] distributed lock 'balancer/celery009:37017:1411743445:1804289383' unlocked.
Mon Dec 8 06:33:40 [Balancer] distributed lock 'balancer/celery009:37017:1411743445:1804289383' acquired, ts : 5485b6c4c16e738217f43162
Mon Dec 8 06:33:40 [Balancer] distributed lock 'balancer/persist031:37017:1415803447:1804289383' unlocked.
Mon Dec 8 06:33:39 [Balancer] distributed lock 'balancer/handprocessor001:37017:1417702049:1804289383' unlocked.
Mon Dec 8 06:33:39 [Balancer] distributed lock 'balancer/handprocessor001:37017:1417702049:1804289383' acquired, ts : 5485b6c3a0bc80f775bfebfa
Mon Dec 8 06:33:39 [Balancer] distributed lock 'balancer/handprocessor002:37017:1417702043:1804289383' was not acquired.
Mon Dec 8 06:33:39 [Balancer] could not acquire lock 'balancer/handprocessor002:37017:1417702043:1804289383' (another update won)
Mon Dec 8 06:33:39 [Balancer] distributed lock 'balancer/persist030:37017:1415803442:1804289383' unlocked.
Mon Dec 8 06:33:39 [Balancer] distributed lock 'balancer/persist030:37017:1415803442:1804289383' acquired, ts : 5485b6c33b6945bd1536a874
Mon Dec 8 06:33:39 [Balancer] distributed lock 'balancer/persist031:37017:1415803447:1804289383' acquired, ts : 5485b6c32373cfc96bd244bb
Mon Dec 8 06:33:39 [Balancer] distributed lock 'balancer/celery007:37017:1411826479:1804289383' unlocked.
Mon Dec 8 06:33:39 [Balancer] distributed lock 'balancer/celery007:37017:1411826479:1804289383' acquired, ts : 5485b6c2782ba83693223c14
Mon Dec 8 06:33:38 [Balancer] distributed lock 'balancer/persist032:37017:1415803466:1804289383' was not acquired.
Mon Dec 8 06:33:38 [Balancer] could not acquire lock 'balancer/persist032:37017:1415803466:1804289383' (another update won)
Mon Dec 8 06:33:38 [Balancer] distributed lock 'balancer/handprocessor003:37017:1417702041:1804289383' was not acquired.
Mon Dec 8 06:33:38 [Balancer] could not acquire lock 'balancer/handprocessor003:37017:1417702041:1804289383' (another update won)
Mon Dec 8 06:33:38 [Balancer] distributed lock 'balancer/persist029:37017:1415803043:1804289383' unlocked.
Mon Dec 8 06:33:38 [Balancer] distributed lock 'balancer/persist029:37017:1415803043:1804289383' acquired, ts : 5485b6c209fbef4954584042
Mon Dec 8 06:33:38 [Balancer] distributed lock 'balancer/persist021:37017:1417712282:1804289383' was not acquired.
Mon Dec 8 06:33:38 [Balancer] could not acquire lock 'balancer/persist021:37017:1417712282:1804289383' (another update won)
Mon Dec 8 06:33:38 [Balancer] distributed lock 'balancer/admin:37017:1404422906:1804289383' was not acquired.
Mon Dec 8 06:33:38 [Balancer] could not acquire lock 'balancer/admin:37017:1404422906:1804289383' (another update won)
Mon Dec 8 06:33:38 [Balancer] distributed lock 'balancer/atc005:37017:1415292838:1804289383' was not acquired.
Mon Dec 8 06:33:38 [Balancer] could not acquire lock 'balancer/atc005:37017:1415292838:1804289383' (another update won)
Mon Dec 8 06:33:38 [Balancer] distributed lock 'balancer/persist025:37017:1411736275:1804289383' was not acquired.
Mon Dec 8 06:33:38 [Balancer] could not acquire lock 'balancer/persist025:37017:1411736275:1804289383' (another update won)
Mon Dec 8 06:33:34 [Balancer] distributed lock 'balancer/celery009:37017:1411743445:1804289383' unlocked.
Mon Dec 8 06:33:33 [Balancer] distributed lock 'balancer/celery008:37017:1412085153:1804289383' unlocked.
Mon Dec 8 06:33:33 [Balancer] distributed lock 'balancer/celery008:37017:1412085153:1804289383' acquired, ts : 5485b6bddedaba9985300a66
Mon Dec 8 06:33:33 [Balancer] distributed lock 'balancer/persist026:37017:1401989330:1804289383' unlocked.
Mon Dec 8 06:33:33 [Balancer] distributed lock 'balancer/persist026:37017:1401989330:1804289383' acquired, ts : 5485b6bda22378eafc69af0e
Mon Dec 8 06:33:33 [Balancer] distributed lock 'balancer/celery009:37017:1411743445:1804289383' acquired, ts : 5485b6bdc16e738217f43161
Mon Dec 8 06:33:32 [Balancer] distributed lock 'balancer/persist027:37017:1401989364:1804289383' unlocked.
Mon Dec 8 06:33:32 [Balancer] distributed lock 'balancer/persist027:37017:1401989364:1804289383' acquired, ts : 5485b6bc12aeba6251704c6d
Mon Dec 8 06:33:32 [Balancer] distributed lock 'balancer/celery007:37017:1411826479:1804289383' unlocked.
Mon Dec 8 06:33:32 [Balancer] distributed lock 'balancer/celery007:37017:1411826479:1804289383' acquired, ts : 5485b6bc782ba83693223c13
Mon Dec 8 06:33:32 [Balancer] distributed lock 'balancer/persist021:37017:1417712282:1804289383' unlocked.
Mon Dec 8 06:33:32 [Balancer] distributed lock 'balancer/persist021:37017:1417712282:1804289383' acquired, ts : 5485b6bc57c9913cc0d2f1c8
Mon Dec 8 06:33:27 [Balancer] distributed lock 'balancer/handprocessor001:37017:1417702049:1804289383' was not acquired.
Mon Dec 8 06:33:27 [Balancer] could not acquire lock 'balancer/handprocessor001:37017:1417702049:1804289383' (another update won)
Mon Dec 8 06:33:27 [Balancer] distributed lock 'balancer/handprocessor002:37017:1417702043:1804289383' was not acquired.
Mon Dec 8 06:33:27 [Balancer] could not acquire lock 'balancer/handprocessor002:37017:1417702043:1804289383' (another update won)
Mon Dec 8 06:33:27 [LockPinger] handled late remove of old distributed lock with ts 5485b6a443559d4e44273810
Mon Dec 8 06:33:27 [LockPinger] trying to delete 1 old lock entries for process handprocessor002:37017:1417702043:1804289383
Mon Dec 8 06:33:27 [Balancer] distributed lock 'balancer/celery008:37017:1412085153:1804289383' unlocked.
Mon Dec 8 06:33:27 [Balancer] distributed lock 'balancer/celery008:37017:1412085153:1804289383' acquired, ts : 5485b6b7dedaba9985300a65
Mon Dec 8 06:33:27 [Balancer] distributed lock 'balancer/celery009:37017:1411743445:1804289383' unlocked.
Mon Dec 8 06:33:27 [Balancer] distributed lock 'balancer/celery009:37017:1411743445:1804289383' acquired, ts : 5485b6b7c16e738217f43160
Mon Dec 8 06:33:27 [Balancer] distributed lock 'balancer/persist028:37017:1415803036:1804289383' unlocked.
Mon Dec 8 06:33:27 [Balancer] distributed lock 'balancer/persist028:37017:1415803036:1804289383' acquired, ts : 5485b6b73fead8ae6b95b66b
Mon Dec 8 06:33:27 [Balancer] distributed lock 'balancer/persist031:37017:1415803447:1804289383' was not acquired.
Mon Dec 8 06:33:27 [Balancer] could not acquire lock 'balancer/persist031:37017:1415803447:1804289383' (another update won)
Mon Dec 8 06:33:26 [Balancer] distributed lock 'balancer/persist027:37017:1401989364:1804289383' was not acquired.
Mon Dec 8 06:33:26 [Balancer] lock update lost, lock 'balancer/persist027:37017:1401989364:1804289383' not propagated.
Mon Dec 8 06:33:26 [Balancer] warning: distributed lock 'balancer/persist027:37017:1401989364:1804289383 did not propagate properly. :: caused by :: 8017 update not consistent ns: config.locks query: { _id: "balancer", state: 0, ts: ObjectId('5485b6b12373cfc96bd244b9') } update: { $set: { state: 1, who: "persist027:37017:1401989364:1804289383:Balancer:52019520", process: "persist027:37017:1401989364:1804289383", when: new Date(1418049206287), why: "doing balance round", ts: ObjectId('5485b6b612aeba6251704c6c') } } gle1: { updatedExisting: false, n: 0, connectionId: 200, waited: 26, err: null, ok: 1.0 } gle2: { updatedExisting: true, n: 1, connectionId: 869, waited: 12, err: null, ok: 1.0 }
Mon Dec 8 06:33:26 [Balancer] distributed lock 'balancer/persist032:37017:1415803466:1804289383' was not acquired.
Mon Dec 8 06:33:26 [Balancer] could not acquire lock 'balancer/persist032:37017:1415803466:1804289383' (another update won)
Mon Dec 8 06:33:26 [Balancer] distributed lock 'balancer/celery007:37017:1411826479:1804289383' unlocked.
Mon Dec 8 06:33:26 [Balancer] distributed lock 'balancer/celery007:37017:1411826479:1804289383' acquired, ts : 5485b6b6782ba83693223c12
Mon Dec 8 06:33:26 [Balancer] lock update won, completing lock propagation for 'balancer/celery007:37017:1411826479:1804289383'
Mon Dec 8 06:33:26 [Balancer] warning: distributed lock 'balancer/celery007:37017:1411826479:1804289383 did not propagate properly. :: caused by :: 8017 update not consistent ns: config.locks query: { _id: "balancer", state: 0, ts: ObjectId('5485b6b12373cfc96bd244b9') } update: { $set: { state: 1, who: "celery007:37017:1411826479:1804289383:Balancer:928963656", process: "celery007:37017:1411826479:1804289383", when: new Date(1418049206282), why: "doing balance round", ts: ObjectId('5485b6b6782ba83693223c12') } } gle1: { updatedExisting: true, n: 1, connectionId: 71, waited: 27, err: null, ok: 1.0 } gle2: { updatedExisting: false, n: 0, connectionId: 783, waited: 13, err: null, ok: 1.0 }
Mon Dec 8 06:33:26 [Balancer] distributed lock 'balancer/handprocessor003:37017:1417702041:1804289383' was not acquired.
/usr/bin/mongos(_ZN5mongo15printStackTraceERSo+0x21) [0x814d51]
/usr/bin/mongos(_ZN5mongo11msgassertedEiPKc+0x99) [0x7ddbb9]
/usr/bin/mongos() [0x7ddd3c]
/usr/bin/mongos(_ZN5mongo15StaticShardInfo4findERKSs+0x358) [0x76ebd8]
/usr/bin/mongos(_ZN5mongo5Shard5resetERKSs+0x34) [0x76a1e4]
/usr/bin/mongos(_ZN5mongo17checkShardVersionEPNS_12DBClientBaseERKSsN5boost10shared_ptrIKNS_12ChunkManagerEEEbi+0x8ea) [0x77029a]
/usr/bin/mongos(_ZN5mongo14VersionManager19checkShardVersionCBEPNS_15ShardConnectionEbi+0x6a) [0x7724ba]
/usr/bin/mongos(_ZN5mongo15ShardConnection11_finishInitEv+0xfc) [0x77637c]
/usr/bin/mongos(_ZN5mongo13ShardStrategy7_insertERKSsRSt6vectorINS_7BSONObjESaIS4_EERSt3mapIN5boost10shared_ptrIKNS_5ChunkEEES6_St4lessISD_ESaISt4pairIKSD_S6_EEEiRNS_7RequestERNS_9DbMessageEi+0x1c0) [0x78e510]
/usr/bin/mongos(_ZN5mongo13ShardStrategy7writeOpEiRNS_7RequestE+0x555) [0x792f25]
/usr/bin/mongos(_ZN5mongo7Request7processEi+0xe8) [0x760ba8]
/usr/bin/mongos(_ZN5mongo21ShardedMessageHandler7processERNS_7MessageEPNS_21AbstractMessagingPortEPNS_9LastErrorE+0x71) [0x500dd1]
/usr/bin/mongos(_ZN5mongo3pms9threadRunEPNS_13MessagingPortE+0x415) [0x802f35]
/lib/x86_64-linux-gnu/libpthread.so.0(+0x7e9a) [0x7fc74d8fae9a]
/lib/x86_64-linux-gnu/libc.so.6(clone+0x6d) [0x7fc74cc0e31d]
Mon Dec 8 06:33:26 [Balancer] could not acquire lock 'balancer/handprocessor003:37017:1417702041:1804289383' (another update won)
Mon Dec 8 06:33:26 [Balancer] distributed lock 'balancer/admin:37017:1404422906:1804289383' was not acquired.
Mon Dec 8 06:33:26 [Balancer] could not acquire lock 'balancer/admin:37017:1404422906:1804289383' (another update won)
Mon Dec 8 06:33:26 [Balancer] distributed lock 'balancer/atc005:37017:1415292838:1804289383' was not acquired.
Mon Dec 8 06:33:26 [Balancer] could not acquire lock 'balancer/atc005:37017:1415292838:1804289383' (another update won)
Mon Dec 8 06:33:26 [Balancer] distributed lock 'balancer/persist029:37017:1415803043:1804289383' was not acquired.
Mon Dec 8 06:33:26 [Balancer] could not acquire lock 'balancer/persist029:37017:1415803043:1804289383' (another update won)
Mon Dec 8 06:33:26 [Balancer] distributed lock 'balancer/persist025:37017:1411736275:1804289383' was not acquired.
Mon Dec 8 06:33:26 [Balancer] could not acquire lock 'balancer/persist025:37017:1411736275:1804289383' (another update won)
Mon Dec 8 06:33:21 [Balancer] distributed lock 'balancer/handprocessor001:37017:1417702049:1804289383' was not acquired.
Mon Dec 8 06:33:21 [Balancer] could not acquire lock 'balancer/handprocessor001:37017:1417702049:1804289383' (another update won)
Mon Dec 8 06:33:21 [Balancer] distributed lock 'balancer/handprocessor002:37017:1417702043:1804289383' was not acquired.
Mon Dec 8 06:33:21 [Balancer] could not acquire lock 'balancer/handprocessor002:37017:1417702043:1804289383' (another update won)
Mon Dec 8 06:33:21 [Balancer] distributed lock 'balancer/persist030:37017:1415803442:1804289383' was not acquired.
Mon Dec 8 06:33:21 [Balancer] could not acquire lock 'balancer/persist030:37017:1415803442:1804289383' (another update won)
Mon Dec 8 06:33:21 [Balancer] distributed lock 'balancer/persist028:37017:1415803036:1804289383' unlocked.
Mon Dec 8 06:33:21 [Balancer] distributed lock 'balancer/persist028:37017:1415803036:1804289383' acquired, ts : 5485b6b03fead8ae6b95b66a
Mon Dec 8 06:33:21 [Balancer] distributed lock 'balancer/persist031:37017:1415803447:1804289383' unlocked.
Mon Dec 8 06:33:21 [Balancer] distributed lock 'balancer/persist031:37017:1415803447:1804289383' acquired, ts : 5485b6b12373cfc96bd244b9
Mon Dec 8 06:33:18 [Balancer] distributed lock 'balancer/persist027:37017:1401989364:1804289383' unlocked.
Mon Dec 8 06:33:18 [Balancer] distributed lock 'balancer/persist027:37017:1401989364:1804289383' acquired, ts : 5485b6ae12aeba6251704c6b
Mon Dec 8 06:33:16 [Balancer] distributed lock 'balancer/persist021:37017:1417712282:1804289383' unlocked.
Mon Dec 8 06:33:16 [Balancer] distributed lock 'balancer/persist021:37017:1417712282:1804289383' acquired, ts : 5485b6ac57c9913cc0d2f1c7
Mon Dec 8 06:33:16 [Balancer] distributed lock 'balancer/persist025:37017:1411736275:1804289383' was not acquired.
Mon Dec 8 06:33:16 [Balancer] could not acquire lock 'balancer/persist025:37017:1411736275:1804289383' (another update won)
Mon Dec 8 06:33:16 [Balancer] distributed lock 'balancer/admin:37017:1404422906:1804289383' unlocked.
Mon Dec 8 06:33:16 [Balancer] distributed lock 'balancer/admin:37017:1404422906:1804289383' acquired, ts : 5485b6acbe87cb7e75ea9d63
Mon Dec 8 06:33:16 [Balancer] distributed lock 'balancer/persist032:37017:1415803466:1804289383' unlocked.
Mon Dec 8 06:33:16 [Balancer] distributed lock 'balancer/persist032:37017:1415803466:1804289383' acquired, ts : 5485b6ab2c69c45f2a151754
Mon Dec 8 06:33:15 [Balancer] distributed lock 'balancer/handprocessor003:37017:1417702041:1804289383' unlocked.
/usr/bin/mongos(_ZN5mongo15printStackTraceERSo+0x21) [0x814d51]
/usr/bin/mongos(_ZN5mongo11msgassertedEiPKc+0x99) [0x7ddbb9]
/usr/bin/mongos() [0x7ddd3c]
/usr/bin/mongos(_ZN5mongo15StaticShardInfo4findERKSs+0x358) [0x76ebd8]
/usr/bin/mongos(_ZN5mongo5Shard5resetERKSs+0x34) [0x76a1e4]
/usr/bin/mongos(_ZN5mongo17checkShardVersionEPNS_12DBClientBaseERKSsN5boost10shared_ptrIKNS_12ChunkManagerEEEbi+0x8ea) [0x77029a]
/usr/bin/mongos(_ZN5mongo14VersionManager19checkShardVersionCBEPNS_15ShardConnectionEbi+0x6a) [0x7724ba]
/usr/bin/mongos(_ZN5mongo15ShardConnection11_finishInitEv+0xfc) [0x77637c]
/usr/bin/mongos(_ZN5mongo13ShardStrategy7_insertERKSsRSt6vectorINS_7BSONObjESaIS4_EERSt3mapIN5boost10shared_ptrIKNS_5ChunkEEES6_St4lessISD_ESaISt4pairIKSD_S6_EEEiRNS_7RequestERNS_9DbMessageEi+0x1c0) [0x78e510]
/usr/bin/mongos(_ZN5mongo13ShardStrategy7writeOpEiRNS_7RequestE+0x555) [0x792f25]
/usr/bin/mongos(_ZN5mongo7Request7processEi+0xe8) [0x760ba8]
/usr/bin/mongos(_ZN5mongo21ShardedMessageHandler7processERNS_7MessageEPNS_21AbstractMessagingPortEPNS_9LastErrorE+0x71) [0x500dd1]
/usr/bin/mongos(_ZN5mongo3pms9threadRunEPNS_13MessagingPortE+0x415) [0x802f35]
/lib/x86_64-linux-gnu/libpthread.so.0(+0x7e9a) [0x7fc74d8fae9a]
/lib/x86_64-linux-gnu/libc.so.6(clone+0x6d) [0x7fc74cc0e31d]
Mon Dec 8 06:33:15 [Balancer] distributed lock 'balancer/handprocessor003:37017:1417702041:1804289383' acquired, ts : 5485b6abe48200a3cd89c063
Mon Dec 8 06:33:15 [Balancer] distributed lock 'balancer/celery009:37017:1411743445:1804289383' unlocked.
Mon Dec 8 06:33:15 [Balancer] distributed lock 'balancer/celery009:37017:1411743445:1804289383' acquired, ts : 5485b6abc16e738217f4315f
Mon Dec 8 06:33:15 [Balancer] distributed lock 'balancer/persist026:37017:1401989330:1804289383' was not acquired.
Mon Dec 8 06:33:15 [Balancer] could not acquire lock 'balancer/persist026:37017:1401989330:1804289383' (another update won)
Mon Dec 8 06:33:15 [Balancer] distributed lock 'balancer/handprocessor002:37017:1417702043:1804289383' unlocked.
Mon Dec 8 06:33:15 [Balancer] distributed lock 'balancer/persist031:37017:1415803447:1804289383' unlocked.
Mon Dec 8 06:33:15 [Balancer] distributed lock 'balancer/persist031:37017:1415803447:1804289383' acquired, ts : 5485b6ab2373cfc96bd244b8
Mon Dec 8 06:33:14 [Balancer] distributed lock 'balancer/handprocessor002:37017:1417702043:1804289383' acquired, ts : 5485b6aa43559d4e44273811
Mon Dec 8 06:33:12 [Balancer] distributed lock 'balancer/persist027:37017:1401989364:1804289383' unlocked.
Mon Dec 8 06:33:11 [Balancer] distributed lock 'balancer/persist027:37017:1401989364:1804289383' acquired, ts : 5485b6a712aeba6251704c6a
Mon Dec 8 06:33:10 [Balancer] distributed lock 'balancer/persist025:37017:1411736275:1804289383' unlocked.
Mon Dec 8 06:33:10 [Balancer] distributed lock 'balancer/persist025:37017:1411736275:1804289383' acquired, ts : 5485b6a6d68883f6fc4065ed
Mon Dec 8 06:33:10 [Balancer] distributed lock 'balancer/admin:37017:1404422906:1804289383' unlocked.
Mon Dec 8 06:33:10 [Balancer] distributed lock 'balancer/admin:37017:1404422906:1804289383' acquired, ts : 5485b6a6be87cb7e75ea9d62
Mon Dec 8 06:33:10 [Balancer] distributed lock 'balancer/celery007:37017:1411826479:1804289383' unlocked.
Mon Dec 8 06:33:10 [Balancer] distributed lock 'balancer/celery007:37017:1411826479:1804289383' acquired, ts : 5485b6a5782ba83693223c11
Mon Dec 8 06:33:09 [Balancer] distributed lock 'balancer/handprocessor003:37017:1417702041:1804289383' unlocked.
Mon Dec 8 06:33:09 [Balancer] distributed lock 'balancer/handprocessor003:37017:1417702041:1804289383' acquired, ts : 5485b6a5e48200a3cd89c062
Mon Dec 8 06:33:09 [Balancer] distributed lock 'balancer/persist029:37017:1415803043:1804289383' unlocked.
Mon Dec 8 06:33:09 [Balancer] distributed lock 'balancer/persist029:37017:1415803043:1804289383' acquired, ts : 5485b6a509fbef4954584040
Mon Dec 8 06:33:09 [Balancer] distributed lock 'balancer/celery008:37017:1412085153:1804289383' unlocked.
Mon Dec 8 06:33:09 [Balancer] distributed lock 'balancer/celery008:37017:1412085153:1804289383' acquired, ts : 5485b6a5dedaba9985300a64
Mon Dec 8 06:33:09 [Balancer] distributed lock 'balancer/persist026:37017:1401989330:1804289383' unlocked.
Mon Dec 8 06:33:09 [LockPinger] handled late remove of old distributed lock with ts 5485b6982373cfc96bd244b7
Mon Dec 8 06:33:08 [Balancer] distributed lock 'balancer/persist026:37017:1401989330:1804289383' acquired, ts : 5485b6a4a22378eafc69af0c
Mon Dec 8 06:33:08 [Balancer] lock update won, completing lock propagation for 'balancer/persist026:37017:1401989330:1804289383'
Mon Dec 8 06:33:08 [Balancer] warning: distributed lock 'balancer/persist026:37017:1401989330:1804289383 did not propagate properly. :: caused by :: 8017 update not consistent ns: config.locks query: { _id: "balancer", state: 0, ts: ObjectId('5485b6a1d7cdbe42a83270eb') } update: { $set: { state: 1, who: "persist026:37017:1401989330:1804289383:Balancer:1461125836", process: "persist026:37017:1401989330:1804289383", when: new Date(1418049188733), why: "doing balance round", ts: ObjectId('5485b6a4a22378eafc69af0c') } } gle1: { updatedExisting: false, n: 0, connectionId: 145, waited: 20, err: null, ok: 1.0 } gle2: { updatedExisting: true, n: 1, connectionId: 923284, waited: 5, err: null, ok: 1.0 }
Mon Dec 8 06:33:08 [Balancer] distributed lock 'balancer/handprocessor002:37017:1417702043:1804289383' was not acquired.
Mon Dec 8 06:33:08 [Balancer] lock update lost, lock 'balancer/handprocessor002:37017:1417702043:1804289383' not propagated.
Mon Dec 8 06:33:08 [Balancer] warning: distributed lock 'balancer/handprocessor002:37017:1417702043:1804289383 did not propagate properly. :: caused by :: 8017 update not consistent ns: config.locks query: { _id: "balancer", state: 0, ts: ObjectId('5485b6a1d7cdbe42a83270eb') } update: { $set: { state: 1, who: "handprocessor002:37017:1417702043:1804289383:Balancer:1897041701", process: "handprocessor002:37017:1417702043:1804289383", when: new Date(1418049188776), why: "doing balance round", ts: ObjectId('5485b6a443559d4e44273810') } } gle1: { updatedExisting: true, n: 1, connectionId: 100, waited: 20, err: null, ok: 1.0 } gle2: { updatedExisting: false, n: 0, connectionId: 1732181, waited: 5, err: null, ok: 1.0 }
Mon Dec 8 06:33:08 [LockPinger] trying to delete 1 old lock entries for process persist031:37017:1415803447:1804289383
Mon Dec 8 06:33:05 [Balancer] distributed lock 'balancer/persist027:37017:1401989364:1804289383' unlocked.
Mon Dec 8 06:33:05 [Balancer] distributed lock 'balancer/persist027:37017:1401989364:1804289383' acquired, ts : 5485b6a012aeba6251704c69
Mon Dec 8 06:33:04 [Balancer] distributed lock 'balancer/persist021:37017:1417712282:1804289383' unlocked.
Mon Dec 8 06:33:04 [Balancer] distributed lock 'balancer/persist021:37017:1417712282:1804289383' acquired, ts : 5485b6a057c9913cc0d2f1c6
Mon Dec 8 06:33:04 [Balancer] distributed lock 'balancer/persist025:37017:1411736275:1804289383' unlocked.
Mon Dec 8 06:33:04 [Balancer] distributed lock 'balancer/persist025:37017:1411736275:1804289383' acquired, ts : 5485b69fd68883f6fc4065ec
Mon Dec 8 06:33:04 [Balancer] distributed lock 'balancer/atc005:37017:1415292838:1804289383' unlocked.
Mon Dec 8 06:33:04 [Balancer] distributed lock 'balancer/atc005:37017:1415292838:1804289383' acquired, ts : 5485b6a0903bf7e88aaa2272
Mon Dec 8 06:33:03 [Balancer] distributed lock 'balancer/persist029:37017:1415803043:1804289383' unlocked.
Mon Dec 8 06:33:03 [Balancer] distributed lock 'balancer/persist029:37017:1415803043:1804289383' acquired, ts : 5485b69e09fbef495458403f
Mon Dec 8 06:33:02 [Balancer] distributed lock 'balancer/handprocessor001:37017:1417702049:1804289383' was not acquired.
Mon Dec 8 06:33:02 [Balancer] could not acquire lock 'balancer/handprocessor001:37017:1417702049:1804289383' (another update won)
Mon Dec 8 06:33:02 [Balancer] distributed lock 'balancer/handprocessor002:37017:1417702043:1804289383' unlocked.
Mon Dec 8 06:33:02 [Balancer] distributed lock 'balancer/handprocessor002:37017:1417702043:1804289383' acquired, ts : 5485b69e43559d4e4427380f
/usr/bin/mongos(_ZN5mongo15printStackTraceERSo+0x21) [0x814d51]
/usr/bin/mongos(_ZN5mongo11msgassertedEiPKc+0x99) [0x7ddbb9]
/usr/bin/mongos() [0x7ddd3c]
/usr/bin/mongos(_ZN5mongo15StaticShardInfo4findERKSs+0x358) [0x76ebd8]
/usr/bin/mongos(_ZN5mongo5Shard5resetERKSs+0x34) [0x76a1e4]
/usr/bin/mongos(_ZN5mongo17checkShardVersionEPNS_12DBClientBaseERKSsN5boost10shared_ptrIKNS_12ChunkManagerEEEbi+0x8ea) [0x77029a]
/usr/bin/mongos(_ZN5mongo14VersionManager19checkShardVersionCBEPNS_15ShardConnectionEbi+0x6a) [0x7724ba]
/usr/bin/mongos(_ZN5mongo15ShardConnection11_finishInitEv+0xfc) [0x77637c]
/usr/bin/mongos(_ZN5mongo13ShardStrategy7_insertERKSsRSt6vectorINS_7BSONObjESaIS4_EERSt3mapIN5boost10shared_ptrIKNS_5ChunkEEES6_St4lessISD_ESaISt4pairIKSD_S6_EEEiRNS_7RequestERNS_9DbMessageEi+0x1c0) [0x78e510]
/usr/bin/mongos(_ZN5mongo13ShardStrategy7writeOpEiRNS_7RequestE+0x555) [0x792f25]
/usr/bin/mongos(_ZN5mongo7Request7processEi+0xe8) [0x760ba8]
/usr/bin/mongos(_ZN5mongo21ShardedMessageHandler7processERNS_7MessageEPNS_21AbstractMessagingPortEPNS_9LastErrorE+0x71) [0x500dd1]
/usr/bin/mongos(_ZN5mongo3pms9threadRunEPNS_13MessagingPortE+0x415) [0x802f35]
/lib/x86_64-linux-gnu/libpthread.so.0(+0x7e9a) [0x7f5637811e9a]
/lib/x86_64-linux-gnu/libc.so.6(clone+0x6d) [0x7f5636b2531d]
Mon Dec 8 06:32:58 [Balancer] distributed lock 'balancer/persist027:37017:1401989364:1804289383' unlocked.
Mon Dec 8 06:32:58 [Balancer] distributed lock 'balancer/persist027:37017:1401989364:1804289383' acquired, ts : 5485b69a12aeba6251704c68
Mon Dec 8 06:32:58 [Balancer] distributed lock 'balancer/persist021:37017:1417712282:1804289383' unlocked.
Mon Dec 8 06:32:58 [Balancer] distributed lock 'balancer/admin:37017:1404422906:1804289383' unlocked.
Mon Dec 8 06:32:58 [Balancer] distributed lock 'balancer/admin:37017:1404422906:1804289383' acquired, ts : 5485b69abe87cb7e75ea9d61
Mon Dec 8 06:32:58 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process admin:37017:1404422906:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:32:58 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process atc005:37017:1415292838:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:32:57 [Balancer] distributed lock 'balancer/persist021:37017:1417712282:1804289383' acquired, ts : 5485b69957c9913cc0d2f1c5
Mon Dec 8 06:32:57 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process persist032:37017:1415803466:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:32:57 [Balancer] distributed lock 'balancer/handprocessor003:37017:1417702041:1804289383' unlocked.
/usr/bin/mongos(_ZN5mongo15printStackTraceERSo+0x21) [0x814d51]
/usr/bin/mongos(_ZN5mongo11msgassertedEiPKc+0x99) [0x7ddbb9]
/usr/bin/mongos() [0x7ddd3c]
/usr/bin/mongos(_ZN5mongo15StaticShardInfo4findERKSs+0x358) [0x76ebd8]
/usr/bin/mongos(_ZN5mongo5Shard5resetERKSs+0x34) [0x76a1e4]
/usr/bin/mongos(_ZN5mongo17checkShardVersionEPNS_12DBClientBaseERKSsN5boost10shared_ptrIKNS_12ChunkManagerEEEbi+0x8ea) [0x77029a]
/usr/bin/mongos(_ZN5mongo14VersionManager19checkShardVersionCBEPNS_15ShardConnectionEbi+0x6a) [0x7724ba]
/usr/bin/mongos(_ZN5mongo15ShardConnection11_finishInitEv+0xfc) [0x77637c]
/usr/bin/mongos(_ZN5mongo13ShardStrategy7_insertERKSsRSt6vectorINS_7BSONObjESaIS4_EERSt3mapIN5boost10shared_ptrIKNS_5ChunkEEES6_St4lessISD_ESaISt4pairIKSD_S6_EEEiRNS_7RequestERNS_9DbMessageEi+0x1c0) [0x78e510]
/usr/bin/mongos(_ZN5mongo13ShardStrategy7writeOpEiRNS_7RequestE+0x555) [0x792f25]
/usr/bin/mongos(_ZN5mongo7Request7processEi+0xe8) [0x760ba8]
/usr/bin/mongos(_ZN5mongo21ShardedMessageHandler7processERNS_7MessageEPNS_21AbstractMessagingPortEPNS_9LastErrorE+0x71) [0x500dd1]
/usr/bin/mongos(_ZN5mongo3pms9threadRunEPNS_13MessagingPortE+0x415) [0x802f35]
/lib/x86_64-linux-gnu/libpthread.so.0(+0x7e9a) [0x7fc74d8fae9a]
/lib/x86_64-linux-gnu/libc.so.6(clone+0x6d) [0x7fc74cc0e31d]
Mon Dec 8 06:32:57 [Balancer] distributed lock 'balancer/handprocessor003:37017:1417702041:1804289383' acquired, ts : 5485b698e48200a3cd89c061
Mon Dec 8 06:32:57 [Balancer] lock update won, completing lock propagation for 'balancer/handprocessor003:37017:1417702041:1804289383'
Mon Dec 8 06:32:57 [Balancer] warning: distributed lock 'balancer/handprocessor003:37017:1417702041:1804289383 did not propagate properly. :: caused by :: 8017 update not consistent ns: config.locks query: { _id: "balancer", state: 0, ts: ObjectId('5485b6963fead8ae6b95b669') } update: { $set: { state: 1, who: "handprocessor003:37017:1417702041:1804289383:Balancer:132982846", process: "handprocessor003:37017:1417702041:1804289383", when: new Date(1418049176951), why: "doing balance round", ts: ObjectId('5485b698e48200a3cd89c061') } } gle1: { updatedExisting: false, n: 0, connectionId: 116, waited: 27, err: null, ok: 1.0 } gle2: { updatedExisting: true, n: 1, connectionId: 1751349, waited: 4, err: null, ok: 1.0 }
Mon Dec 8 06:32:57 [Balancer] distributed lock 'balancer/persist025:37017:1411736275:1804289383' unlocked.
Mon Dec 8 06:32:57 [Balancer] distributed lock 'balancer/persist025:37017:1411736275:1804289383' acquired, ts : 5485b699d68883f6fc4065eb
Mon Dec 8 06:32:57 [Balancer] distributed lock 'balancer/persist031:37017:1415803447:1804289383' was not acquired.
Mon Dec 8 06:32:57 [Balancer] lock update lost, lock 'balancer/persist031:37017:1415803447:1804289383' not propagated.
Mon Dec 8 06:32:57 [Balancer] warning: distributed lock 'balancer/persist031:37017:1415803447:1804289383 did not propagate properly. :: caused by :: 8017 update not consistent ns: config.locks query: { _id: "balancer", state: 0, ts: ObjectId('5485b6963fead8ae6b95b669') } update: { $set: { state: 1, who: "persist031:37017:1415803447:1804289383:Balancer:835514025", process: "persist031:37017:1415803447:1804289383", when: new Date(1418049176947), why: "doing balance round", ts: ObjectId('5485b6982373cfc96bd244b7') } } gle1: { updatedExisting: true, n: 1, connectionId: 39, waited: 27, err: null, ok: 1.0 } gle2: { updatedExisting: false, n: 0, connectionId: 1629575, waited: 4, err: null, ok: 1.0 }
Mon Dec 8 06:32:56 [Balancer] distributed lock 'balancer/persist029:37017:1415803043:1804289383' was not acquired.
Mon Dec 8 06:32:56 [Balancer] could not acquire lock 'balancer/persist029:37017:1415803043:1804289383' (another update won)
Mon Dec 8 06:32:56 [Balancer] caught exception while doing balance: exception creating distributed lock balancer/persist026:37017:1401989330:1804289383 :: caused by :: SyncClusterConnection::udpate prepare failed: 9001 socket exception [2] server [172.30.68.121:27019] mongo-livelog-c-2:27019:{}
Mon Dec 8 06:32:56 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process persist026:37017:1401989330:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:32:56 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process handprocessor002:37017:1417702043:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:32:56 [Balancer] distributed lock 'balancer/persist028:37017:1415803036:1804289383' unlocked.
Mon Dec 8 06:32:56 [Balancer] distributed lock 'balancer/persist028:37017:1415803036:1804289383' acquired, ts : 5485b6963fead8ae6b95b669
Mon Dec 8 06:32:56 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process persist029:37017:1415803043:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:32:52 [Balancer] distributed lock 'balancer/persist027:37017:1401989364:1804289383' unlocked.
Mon Dec 8 06:32:52 [Balancer] distributed lock 'balancer/persist027:37017:1401989364:1804289383' acquired, ts : 5485b69412aeba6251704c67
Mon Dec 8 06:32:51 [Balancer] distributed lock 'balancer/persist030:37017:1415803442:1804289383' unlocked.
Mon Dec 8 06:32:51 [Balancer] distributed lock 'balancer/persist030:37017:1415803442:1804289383' acquired, ts : 5485b6933b6945bd1536a872
Mon Dec 8 06:32:51 [Balancer] distributed lock 'balancer/celery008:37017:1412085153:1804289383' unlocked.
Mon Dec 8 06:32:51 [Balancer] distributed lock 'balancer/celery008:37017:1412085153:1804289383' acquired, ts : 5485b692dedaba9985300a63
Mon Dec 8 06:32:51 [Balancer] distributed lock 'balancer/celery007:37017:1411826479:1804289383' unlocked.
Mon Dec 8 06:32:51 [Balancer] distributed lock 'balancer/celery007:37017:1411826479:1804289383' acquired, ts : 5485b693782ba83693223c10
Mon Dec 8 06:32:51 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process persist021:37017:1417712282:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:32:50 [Balancer] distributed lock 'balancer/handprocessor003:37017:1417702041:1804289383' was not acquired.
Mon Dec 8 06:32:50 [Balancer] could not acquire lock 'balancer/handprocessor003:37017:1417702041:1804289383' (another update won)
Mon Dec 8 06:32:50 [Balancer] distributed lock 'balancer/persist031:37017:1415803447:1804289383' was not acquired.
Mon Dec 8 06:32:50 [Balancer] could not acquire lock 'balancer/persist031:37017:1415803447:1804289383' (another update won)
Mon Dec 8 06:32:49 [Balancer] distributed lock 'balancer/handprocessor001:37017:1417702049:1804289383' unlocked.
Mon Dec 8 06:32:49 [Balancer] distributed lock 'balancer/handprocessor001:37017:1417702049:1804289383' acquired, ts : 5485b691a0bc80f775bfebf6
/usr/bin/mongos(_ZN5mongo15printStackTraceERSo+0x21) [0x814d51]
/usr/bin/mongos(_ZN5mongo11msgassertedEiPKc+0x99) [0x7ddbb9]
/usr/bin/mongos() [0x7ddd3c]
/usr/bin/mongos(_ZN5mongo15StaticShardInfo4findERKSs+0x358) [0x76ebd8]
/usr/bin/mongos(_ZN5mongo5Shard5resetERKSs+0x34) [0x76a1e4]
/usr/bin/mongos(_ZN5mongo17checkShardVersionEPNS_12DBClientBaseERKSsN5boost10shared_ptrIKNS_12ChunkManagerEEEbi+0x8ea) [0x77029a]
/usr/bin/mongos(_ZN5mongo14VersionManager19checkShardVersionCBEPNS_15ShardConnectionEbi+0x6a) [0x7724ba]
/usr/bin/mongos(_ZN5mongo15ShardConnection11_finishInitEv+0xfc) [0x77637c]
/usr/bin/mongos(_ZN5mongo13ShardStrategy7_insertERKSsRSt6vectorINS_7BSONObjESaIS4_EERSt3mapIN5boost10shared_ptrIKNS_5ChunkEEES6_St4lessISD_ESaISt4pairIKSD_S6_EEEiRNS_7RequestERNS_9DbMessageEi+0x1c0) [0x78e510]
/usr/bin/mongos(_ZN5mongo13ShardStrategy7writeOpEiRNS_7RequestE+0x555) [0x792f25]
/usr/bin/mongos(_ZN5mongo7Request7processEi+0xe8) [0x760ba8]
/usr/bin/mongos(_ZN5mongo21ShardedMessageHandler7processERNS_7MessageEPNS_21AbstractMessagingPortEPNS_9LastErrorE+0x71) [0x500dd1]
/usr/bin/mongos(_ZN5mongo3pms9threadRunEPNS_13MessagingPortE+0x415) [0x802f35]
/lib/x86_64-linux-gnu/libpthread.so.0(+0x7e9a) [0x7ff59474be9a]
/lib/x86_64-linux-gnu/libc.so.6(clone+0x6d) [0x7ff593a5f2ed]
Mon Dec 8 06:32:48 [Balancer] distributed lock 'balancer/persist028:37017:1415803036:1804289383' unlocked.
Mon Dec 8 06:32:48 [Balancer] distributed lock 'balancer/persist028:37017:1415803036:1804289383' acquired, ts : 5485b6903fead8ae6b95b668
Mon Dec 8 06:32:48 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process persist028:37017:1415803036:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:32:46 [Balancer] caught exception while doing balance: exception creating distributed lock balancer/persist027:37017:1401989364:1804289383 :: caused by :: SyncClusterConnection::udpate prepare failed: 9001 socket exception [2] server [172.30.68.121:27019] mongo-livelog-c-2:27019:{}
Mon Dec 8 06:32:46 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process persist027:37017:1401989364:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:32:45 [Balancer] distributed lock 'balancer/persist025:37017:1411736275:1804289383' was not acquired.
Mon Dec 8 06:32:45 [Balancer] could not acquire lock 'balancer/persist025:37017:1411736275:1804289383' (another update won)
Mon Dec 8 06:32:45 [Balancer] distributed lock 'balancer/celery009:37017:1411743445:1804289383' was not acquired.
/usr/bin/mongos(_ZN5mongo15printStackTraceERSo+0x21) [0x814d51]
/usr/bin/mongos(_ZN5mongo11msgassertedEiPKc+0x99) [0x7ddbb9]
/usr/bin/mongos() [0x7ddd3c]
/usr/bin/mongos(_ZN5mongo15StaticShardInfo4findERKSs+0x358) [0x76ebd8]
/usr/bin/mongos(_ZN5mongo5Shard5resetERKSs+0x34) [0x76a1e4]
/usr/bin/mongos(_ZN5mongo17checkShardVersionEPNS_12DBClientBaseERKSsN5boost10shared_ptrIKNS_12ChunkManagerEEEbi+0x8ea) [0x77029a]
/usr/bin/mongos(_ZN5mongo14VersionManager19checkShardVersionCBEPNS_15ShardConnectionEbi+0x6a) [0x7724ba]
/usr/bin/mongos(_ZN5mongo15ShardConnection11_finishInitEv+0xfc) [0x77637c]
/usr/bin/mongos(_ZN5mongo13ShardStrategy7_insertERKSsRSt6vectorINS_7BSONObjESaIS4_EERSt3mapIN5boost10shared_ptrIKNS_5ChunkEEES6_St4lessISD_ESaISt4pairIKSD_S6_EEEiRNS_7RequestERNS_9DbMessageEi+0x1c0) [0x78e510]
/usr/bin/mongos(_ZN5mongo13ShardStrategy7writeOpEiRNS_7RequestE+0x555) [0x792f25]
/usr/bin/mongos(_ZN5mongo7Request7processEi+0xe8) [0x760ba8]
/usr/bin/mongos(_ZN5mongo21ShardedMessageHandler7processERNS_7MessageEPNS_21AbstractMessagingPortEPNS_9LastErrorE+0x71) [0x500dd1]
/usr/bin/mongos(_ZN5mongo3pms9threadRunEPNS_13MessagingPortE+0x415) [0x802f35]
/lib/x86_64-linux-gnu/libpthread.so.0(+0x7e9a) [0x7f0497630e9a]
/lib/x86_64-linux-gnu/libc.so.6(clone+0x6d) [0x7f049694431d]
Mon Dec 8 06:32:45 [Balancer] could not acquire lock 'balancer/celery009:37017:1411743445:1804289383' (another update won)
Mon Dec 8 06:32:45 [Balancer] distributed lock 'balancer/celery007:37017:1411826479:1804289383' unlocked.
Mon Dec 8 06:32:45 [Balancer] distributed lock 'balancer/celery007:37017:1411826479:1804289383' acquired, ts : 5485b68c782ba83693223c0f
Mon Dec 8 06:32:44 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process celery008:37017:1412085153:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:32:44 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process celery009:37017:1411743445:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:32:43 [Balancer] distributed lock 'balancer/handprocessor001:37017:1417702049:1804289383' unlocked.
Mon Dec 8 06:32:42 [Balancer] distributed lock 'balancer/handprocessor001:37017:1417702049:1804289383' acquired, ts : 5485b68aa0bc80f775bfebf5
Mon Dec 8 06:32:39 [Balancer] distributed lock 'balancer/persist030:37017:1415803442:1804289383' unlocked.
Mon Dec 8 06:32:38 [Balancer] distributed lock 'balancer/persist030:37017:1415803442:1804289383' acquired, ts : 5485b6863b6945bd1536a871
Mon Dec 8 06:32:38 [Balancer] distributed lock 'balancer/celery007:37017:1411826479:1804289383' was not acquired.
Mon Dec 8 06:32:38 [Balancer] could not acquire lock 'balancer/celery007:37017:1411826479:1804289383' (another update won)
Mon Dec 8 06:32:38 [Balancer] distributed lock 'balancer/persist031:37017:1415803447:1804289383' was not acquired.
Mon Dec 8 06:32:38 [Balancer] could not acquire lock 'balancer/persist031:37017:1415803447:1804289383' (another update won)
Mon Dec 8 06:32:38 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process persist031:37017:1415803447:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:32:38 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process persist030:37017:1415803442:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:32:38 [Balancer] distributed lock 'balancer/persist025:37017:1411736275:1804289383' unlocked.
Mon Dec 8 06:32:38 [Balancer] distributed lock 'balancer/persist025:37017:1411736275:1804289383' acquired, ts : 5485b686d68883f6fc4065e9
Mon Dec 8 06:32:36 [Balancer] distributed lock 'balancer/handprocessor001:37017:1417702049:1804289383' unlocked.
0x814d51 0x7ddbb9 0x7ddd3c 0x76ebd8 0x76a1e4 0x77029a 0x7724ba 0x77637c 0x78e510 0x792f25 0x760ba8 0x500dd1 0x802f35 0x7ff59474be9a 0x7ff593a5f2ed
/usr/bin/mongos(_ZN5mongo15printStackTraceERSo+0x21) [0x814d51]
/usr/bin/mongos(_ZN5mongo11msgassertedEiPKc+0x99) [0x7ddbb9]
/usr/bin/mongos() [0x7ddd3c]
/usr/bin/mongos(_ZN5mongo15StaticShardInfo4findERKSs+0x358) [0x76ebd8]
/usr/bin/mongos(_ZN5mongo5Shard5resetERKSs+0x34) [0x76a1e4]
/usr/bin/mongos(_ZN5mongo17checkShardVersionEPNS_12DBClientBaseERKSsN5boost10shared_ptrIKNS_12ChunkManagerEEEbi+0x8ea) [0x77029a]
/usr/bin/mongos(_ZN5mongo14VersionManager19checkShardVersionCBEPNS_15ShardConnectionEbi+0x6a) [0x7724ba]
/usr/bin/mongos(_ZN5mongo15ShardConnection11_finishInitEv+0xfc) [0x77637c]
/usr/bin/mongos(_ZN5mongo13ShardStrategy7_insertERKSsRSt6vectorINS_7BSONObjESaIS4_EERSt3mapIN5boost10shared_ptrIKNS_5ChunkEEES6_St4lessISD_ESaISt4pairIKSD_S6_EEEiRNS_7RequestERNS_9DbMessageEi+0x1c0) [0x78e510]
/usr/bin/mongos(_ZN5mongo13ShardStrategy7writeOpEiRNS_7RequestE+0x555) [0x792f25]
/usr/bin/mongos(_ZN5mongo7Request7processEi+0xe8) [0x760ba8]
/usr/bin/mongos(_ZN5mongo21ShardedMessageHandler7processERNS_7MessageEPNS_21AbstractMessagingPortEPNS_9LastErrorE+0x71) [0x500dd1]
/usr/bin/mongos(_ZN5mongo3pms9threadRunEPNS_13MessagingPortE+0x415) [0x802f35]
/lib/x86_64-linux-gnu/libpthread.so.0(+0x7e9a) [0x7ff59474be9a]
/lib/x86_64-linux-gnu/libc.so.6(clone+0x6d) [0x7ff593a5f2ed]
Mon Dec 8 06:32:36 [Balancer] distributed lock 'balancer/handprocessor001:37017:1417702049:1804289383' acquired, ts : 5485b684a0bc80f775bfebf4
Mon Dec 8 06:32:36 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process handprocessor001:37017:1417702049:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:32:32 [Balancer] distributed lock 'balancer/handprocessor003:37017:1417702041:1804289383' unlocked.
Mon Dec 8 06:32:32 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process celery007:37017:1411826479:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:32:32 [Balancer] distributed lock 'balancer/persist025:37017:1411736275:1804289383' was not acquired.
Mon Dec 8 06:32:32 [Balancer] could not acquire lock 'balancer/persist025:37017:1411736275:1804289383' (another update won)
Mon Dec 8 06:32:32 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process persist025:37017:1411736275:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:32:32 [Balancer] distributed lock 'balancer/handprocessor003:37017:1417702041:1804289383' acquired, ts : 5485b680e48200a3cd89c05f
Mon Dec 8 06:32:32 [LockPinger] creating distributed lock ping thread for mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019 and process handprocessor003:37017:1417702041:1804289383 (sleeping for 30000ms)
Mon Dec 8 06:29:28 [conn9795] warning: splitChunk failed - cmd: { splitChunk: "live.hand_details_log_admin_6", keyPattern: { shard_hash: 1 }, min: { shard_hash: ObjectId('eb81ca345914d216eecb6473') }, max: { shard_hash: ObjectId('ec0cfa04e64723dacdef7ba3') }, from: "mongo-livelog-c", splitKeys: [ { shard_hash: ObjectId('eba64025720ee674788e8b69') } ], shardId: "live.hand_details_log_admin_6-shard_hash_ObjectId('eb81ca345914d216eecb6473')", configdb: "mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019" } result: { errmsg: "Error locking distributed lock for split. :: caused by :: 13663 exception creating distributed lock live.hand_details_log_admin_6/mongo-livelog-c-1:27...", ok: 0.0 }
Mon Dec 8 06:29:19 [conn10703] warning: splitChunk failed - cmd: { splitChunk: "live.hand_details_log_admin_6", keyPattern: { shard_hash: 1 }, min: { shard_hash: ObjectId('eb81ca345914d216eecb6473') }, max: { shard_hash: ObjectId('ec0cfa04e64723dacdef7ba3') }, from: "mongo-livelog-c", splitKeys: [ { shard_hash: ObjectId('eba64025720ee674788e8b69') } ], shardId: "live.hand_details_log_admin_6-shard_hash_ObjectId('eb81ca345914d216eecb6473')", configdb: "mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019" } result: { errmsg: "Error locking distributed lock for split. :: caused by :: 13663 exception creating distributed lock live.hand_details_log_admin_6/mongo-livelog-c-1:27...", ok: 0.0 }
Mon Dec 8 06:29:15 [conn10945] warning: splitChunk failed - cmd: { splitChunk: "live.hand_details_log_admin_6", keyPattern: { shard_hash: 1 }, min: { shard_hash: ObjectId('eb81ca345914d216eecb6473') }, max: { shard_hash: ObjectId('ec0cfa04e64723dacdef7ba3') }, from: "mongo-livelog-c", splitKeys: [ { shard_hash: ObjectId('eba64025720ee674788e8b69') } ], shardId: "live.hand_details_log_admin_6-shard_hash_ObjectId('eb81ca345914d216eecb6473')", configdb: "mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019" } result: { errmsg: "Error locking distributed lock for split. :: caused by :: 13663 exception creating distributed lock live.hand_details_log_admin_6/mongo-livelog-c-1:27...", ok: 0.0 }
Mon Dec 8 06:29:15 [conn12038] warning: splitChunk failed - cmd: { splitChunk: "live.hand_details_log_debug_6", keyPattern: { shard_hash: 1 }, min: { shard_hash: ObjectId('de8dbf68ad051b3bb4cc20c7') }, max: { shard_hash: ObjectId('dea4e59bbcd5d468e6e596d3') }, from: "mongo-livelog-c", splitKeys: [ { shard_hash: ObjectId('de97d15f9c66aae90661c226') } ], shardId: "live.hand_details_log_debug_6-shard_hash_ObjectId('de8dbf68ad051b3bb4cc20c7')", configdb: "mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019" } result: { errmsg: "Error locking distributed lock for split. :: caused by :: 13663 exception creating distributed lock live.hand_details_log_debug_6/mongo-livelog-c-1:27...", ok: 0.0 }
Mon Dec 8 06:29:06 [conn11025] warning: splitChunk failed - cmd: { splitChunk: "live.hand_details_log_admin_6", keyPattern: { shard_hash: 1 }, min: { shard_hash: ObjectId('eb81ca345914d216eecb6473') }, max: { shard_hash: ObjectId('ec0cfa04e64723dacdef7ba3') }, from: "mongo-livelog-c", splitKeys: [ { shard_hash: ObjectId('eba64025720ee674788e8b69') } ], shardId: "live.hand_details_log_admin_6-shard_hash_ObjectId('eb81ca345914d216eecb6473')", configdb: "mongo-livelog-a-2:27019,mongo-livelog-b-2:27019,mongo-livelog-c-2:27019" } result: { errmsg: "Error locking distributed lock for split. :: caused by :: 13663 exception creating distributed lock live.hand_details_log_admin_6/mongo-livelog-c-1:27...", ok: 0.0 }