C# Driver / CSHARP-4900

Uploading a duplicate file larger than the original one causes errors in downloading the original file

    • Type: Bug
    • Resolution: Fixed
    • Priority: Unknown
    • Fix Version/s: 2.24.0
    • Affects Version/s: 2.23.1
    • Component/s: GridFS
    • Labels: None
    • Backwards Compatibility: Fully Compatible
    • Documentation Changes: Not Needed

      1. What would you like to communicate to the user about this feature?
      2. Would you like the user to see examples of the syntax and/or executable code and its output?
      3. Which versions of the driver/connector does this apply to?


      Summary

      Uploading a duplicate file (same file_id) that is larger than the original causes errors when downloading the original file. The second upload fails with a DuplicateKey error, but the chunks collection ends up containing parts of the second file. A subsequent attempt to download the original file then fails with a GridFSChunkException.

      Please provide the version of the driver. If applicable, please provide the MongoDB server version and topology (standalone, replica set, or sharded cluster).

      MongoDB 6.0.4, reproduced on a Windows standalone, a Docker sharded cluster, and a Windows sharded cluster

      MongoDB.Driver.GridFS 2.23.1

      How to Reproduce

      using MongoDB.Driver;
      using MongoDB.Driver.GridFS;
       
      var r = new Random();
      var content1 = new byte[10];           // original file: 10 bytes (fits in a single chunk)
      var content2 = new byte[36_000_000];   // duplicate file: 36 MB, larger than the original
      r.NextBytes(content1);
      r.NextBytes(content2);
       
      string fileId = "1";
       
      var client = new MongoClient();
      client.DropDatabase("TestDuplicate");
      var db = client.GetDatabase("TestDuplicate");
      var bucket = new GridFSBucket<string>(db);
       
      // Upload the original (small) file.
      bucket.UploadFromBytes(fileId, "unrelevant", content1);
       
      try
      {
          bucket.UploadFromBytes(fileId, "unrelevant", content2);
      }
      catch (MongoBulkWriteException e)
      {
          Console.WriteLine(e.Message);
          // A bulk write operation resulted in one or more errors.
          // WriteErrors: [ { Category : "DuplicateKey", Code : 11000, Message : "E11000 duplicate key error collection: TestDuplicate.fs.chunks index: files_id_1_n_1 dup key: { files_id: "1", n: 0 }" } ]
      }
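      // At this point the second upload has failed with a duplicate key error, but
      // fs.chunks already contains chunks of the larger file alongside chunk n: 0
      // of the original (see Additional Background below).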
       
      try
      {
          var read = bucket.DownloadAsBytes(fileId);
      }
      catch (GridFSChunkException e)
      {
          Console.WriteLine(e.Message);
          // GridFS chunk 1 of file id 1 is missing
      } 
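
      Not part of the original report, but one possible recovery sketch for the repro above: the files document created by the first upload still exists, so deleting the file by id removes every chunk stored under that files_id (including the orphaned chunks left behind by the failed duplicate upload), and the original content can then be re-uploaded. This assumes the original bytes are still available; it is a workaround sketch, not the driver-side fix.

      // Possible cleanup sketch (assumption: the original bytes are still at hand).
      // bucket, fileId and content1 are the objects from the repro above.
      bucket.Delete(fileId);                                    // removes the files document and all chunks for files_id "1"
      bucket.UploadFromBytes(fileId, "unrelevant", content1);   // re-upload the original content
      var recovered = bucket.DownloadAsBytes(fileId);           // download succeeds again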

       

      Additional Background

      The chunks collection contains one chunk from the original file (n: 0), and 64 chunks (n: 64..127) from the duplicate file.
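
      For reference, a small diagnostic sketch (not from the original report; it assumes the default "fs" bucket prefix, an additional using MongoDB.Bson; directive, and the db and fileId variables from the repro) that lists the chunk sequence numbers present for the file id. With the default 255 KiB chunk size the 36 MB duplicate would span roughly 138 chunks, so only part of it reached the collection; on an affected driver this prints 0 followed by 64..127.

      // List the chunk sequence numbers stored for files_id "1" in fs.chunks.
      var chunksCollection = db.GetCollection<BsonDocument>("fs.chunks");
      var chunkDocs = chunksCollection
          .Find(Builders<BsonDocument>.Filter.Eq("files_id", fileId))
          .Project(Builders<BsonDocument>.Projection.Include("n").Exclude("_id"))
          .Sort(Builders<BsonDocument>.Sort.Ascending("n"))
          .ToList();

      foreach (var chunkDoc in chunkDocs)
      {
          Console.WriteLine(chunkDoc["n"].ToInt32());   // 0, then 64..127 on an affected driver
      }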

            Assignee: Oleksandr Poliakov (oleksandr.poliakov@mongodb.com)
            Reporter: Marek Kedziora (marek.kedziora@kdpw.pl)