- Type: Build Failure
- Resolution: Fixed
- Priority: Unknown
- Affects Version/s: None
- Component/s: None
- None
- Python Drivers
- Not Needed
Name of Failure: test.asynchronous.test_client_bulk_write.TestClientBulkWriteTimeout.test_timeout_in_multi_batch_bulk_write
Link to task:
Context of when and why the failure occurred:
Test added by PYTHON-4550
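For context, the timing values in the stack trace below account for the NetworkTimeout itself. A minimal sketch of the timing budget, using the blockTimeMS, "times", and timeoutMS values taken from the test's fail_command and client configuration (nothing here is measured; the numbers are copied from the trace):

```python
# The fail point blocks each of the first two bulkWrite commands for 1010 ms
# ("mode": {"times": 2}, "blockTimeMS": 1010), while the client is created
# with timeoutMS=2000. The cumulative block time exceeds the client timeout,
# so the operation fails with a NetworkTimeout partway through the batches.
block_time_ms = 1010   # blockTimeMS from fail_command
blocked_commands = 2   # "mode": {"times": 2}
timeout_ms = 2000      # client timeoutMS

total_block_ms = block_time_ms * blocked_commands
print(total_block_ms, total_block_ms > timeout_ms)  # → 2020 True
```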
Stack trace:
[2024/08/08 10:43:17.989] FAILURE: AssertionError: 1 != 2 ()

self = <test.asynchronous.test_client_bulk_write.TestClientBulkWriteTimeout testMethod=test_timeout_in_multi_batch_bulk_write>

    @async_client_context.require_version_min(8, 0, 0, -24)
    @async_client_context.require_failCommand_fail_point
    async def test_timeout_in_multi_batch_bulk_write(self):
        internal_client = await async_rs_or_single_client(timeoutMS=None)
        self.addAsyncCleanup(internal_client.aclose)

        collection = internal_client.db["coll"]
        self.addAsyncCleanup(collection.drop)
        await collection.drop()

        max_bson_object_size = (await async_client_context.hello)["maxBsonObjectSize"]
        max_message_size_bytes = (await async_client_context.hello)["maxMessageSizeBytes"]
        fail_command = {
            "configureFailPoint": "failCommand",
            "mode": {"times": 2},
            "data": {"failCommands": ["bulkWrite"], "blockConnection": True, "blockTimeMS": 1010},
        }
        async with self.fail_point(fail_command):
            models = []
            num_models = int(max_message_size_bytes / max_bson_object_size + 1)
            b_repeated = "b" * (max_bson_object_size - 500)
            for _ in range(num_models):
                models.append(
                    InsertOne(
                        namespace="db.coll",
                        document={"a": b_repeated},
                    )
                )

            listener = OvertCommandListener()
            client = await async_rs_or_single_client(
                event_listeners=[listener],
                readConcernLevel="majority",
                readPreference="primary",
                timeoutMS=2000,
                w="majority",
            )
            self.addAsyncCleanup(client.aclose)
            with self.assertRaises(ClientBulkWriteException) as context:
                await client.bulk_write(models=models)
            self.assertIsInstance(context.exception.error, NetworkTimeout)

            bulk_write_events = []
            for event in listener.started_events:
                if event.command_name == "bulkWrite":
                    bulk_write_events.append(event)
>           self.assertEqual(len(bulk_write_events), 2)
E           AssertionError: 1 != 2

test/asynchronous/test_client_bulk_write.py:571: AssertionError
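The failing assertion expects exactly two bulkWrite command-started events. A rough sketch of the batching arithmetic behind that expectation, assuming MongoDB's usual limits of a 16 MiB maxBsonObjectSize and 48 MB maxMessageSizeBytes (hypothetical constants here; the test reads the real values from the server's hello response, and per-document framing overhead is ignored):

```python
# Assumed server limits (the test fetches these from the hello response).
MAX_BSON_OBJECT_SIZE = 16_777_216    # 16 MiB
MAX_MESSAGE_SIZE_BYTES = 48_000_000  # 48 MB

# Same arithmetic as the test: just enough documents to overflow one message.
num_models = int(MAX_MESSAGE_SIZE_BYTES / MAX_BSON_OBJECT_SIZE + 1)
doc_size = MAX_BSON_OBJECT_SIZE - 500  # each document is near the BSON limit

# Greedy split by message size: start a new batch when the next document
# would push the current message past the limit.
batches, current = 1, 0
for _ in range(num_models):
    if current + doc_size > MAX_MESSAGE_SIZE_BYTES:
        batches += 1
        current = 0
    current += doc_size

print(num_models, batches)  # → 3 2
```

Under these assumptions three near-16 MiB documents cannot fit in one 48 MB message, so the driver splits the operation into two bulkWrite batches; the test fails because only one bulkWrite event was observed before the timeout.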