Swift Driver / SWIFT-154

Int should encode as Int32 or Int64 as needed


    Details

    • Type: Improvement
    • Status: Closed
    • Priority: Major - P3
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 0.0.3
    • Component/s: None
    • Labels: None

      Description

      The BsonValue.encode() implementation for Int requires that the integer's value fall within the range of a 32-bit signed integer. If an Int holds a value outside of this range (which is possible depending on the platform architecture), an error is raised suggesting that Int64 be used instead. The driver should be able to detect which BSON type is most appropriate and encode the value accordingly.

      I understand that when decoding from BSON, the driver converts the Int32 and Int64 BSON types to Swift's Int and Int64, respectively. I assume Int was chosen for flexibility, as that jibes with the Swift documentation:

      Unless you need to work with a specific size of integer, always use Int for integer values in your code. This aids code consistency and interoperability.

      Likewise, it's quite possible that applications will use an Int on a 64-bit platform to represent a 64-bit value and expect it to encode to BSON without error.
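
      A minimal sketch (not the driver's actual implementation) of how the encoder could detect the appropriate type: use the BSON int32 representation whenever the value fits in 32 bits, and fall back to int64 otherwise. The BSONInteger enum and toBSONInteger() names below are hypothetical and only for illustration:

      // Hypothetical sketch: pick the BSON integer type based on the value
      // actually stored in a Swift Int. The names here are illustrative only.
      enum BSONInteger {
          case int32(Int32)
          case int64(Int64)
      }

      extension Int {
          // Encode as int32 when the value fits in 32 bits, otherwise as int64.
          func toBSONInteger() -> BSONInteger {
              if let narrow = Int32(exactly: self) {
                  return .int32(narrow)
              }
              // On a 64-bit platform Int is 64 bits wide, so Int64 always fits.
              return .int64(Int64(self))
          }
      }

      This would also mirror the decoding side, where a BSON int32 already becomes a Swift Int.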


            People

            Assignee:
            Jeremy Mikola (jmikola)
            Reporter:
            Jeremy Mikola (jmikola)
            Votes:
            0
            Watchers:
            0
