The Int type in Swift is tricky to handle correctly because its bit width is machine-dependent. But it is also used far more often than Int32 or Int64, so it's important that its behavior be intuitive to users.
The current behavior is as follows:
Putting into documents:
- If the Int will fit in an Int32, its bsonType is considered .int32 and it will be encoded to the document as an Int32.
- Otherwise, its bsonType is .int64 and it will be encoded to the document as an Int64.
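The encoding rule above can be sketched as follows. This is a minimal illustration, not the driver's actual code; the `StoredInteger` type and `encode` function are stand-ins for the real encoding machinery:

```swift
// A minimal sketch of the current encoding rule: an Int that fits in
// 32 bits is stored as an Int32, and anything larger as an Int64.
// (StoredInteger and encode are illustrative names, not the driver's API.)
enum StoredInteger: Equatable {
    case int32(Int32)
    case int64(Int64)
}

func encode(_ value: Int) -> StoredInteger {
    if let small = Int32(exactly: value) {
        return .int32(small)    // value fits in 32 bits
    }
    return .int64(Int64(value)) // too large (or too negative) for Int32
}
```

`Int32(exactly:)` returns nil when the value would not fit, which makes the width check concise and overflow-safe.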
Reading out of documents:
- Any value encoded as an Int32 will be returned as an Int.
- Any value encoded as an Int64 will be returned as an Int64.
There are some problems with this behavior. As Jeremy Mikola pointed out: consider a user who stores an Int, which could be anything in [Int.min, Int.max].
When the user retrieves the value for that key, they will get back a BsonValue?. If the user then wants to cast the value to a concrete type, they must either know ahead of time how large the stored value is, switch on its bsonType property, or try casting to both Int and Int64 and see which succeeds.
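The awkwardness on the caller's side can be modeled in isolation. In this sketch, `Any` stands in for the driver's BsonValue existential, and `readBack` mirrors the current rule, where the dynamic type of the result depends on the stored value's magnitude; both names are assumptions for illustration:

```swift
// Models the current read behavior: Int32-sized values come back as Int,
// everything else comes back as Int64. (Not the driver's actual API.)
func readBack(_ stored: Int) -> Any {
    if let small = Int32(exactly: stored) {
        return Int(small)  // was encoded as an Int32
    }
    return Int64(stored)   // was encoded as an Int64
}

// Without knowing the magnitude in advance, the caller has to try both casts:
func asInt(_ value: Any) -> Int? {
    if let n = value as? Int { return n }
    if let n = value as? Int64 { return Int(exactly: n) }
    return nil
}
```

Every read site ends up needing a helper like `asInt`, or an equivalent pair of conditional casts inline.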
It's not clear what the best solution is, but we should aim for consistent, predictable behavior that does not require the user to have prior knowledge of the stored value.
The way MongoKitten handles this is by always encoding Int as Int64.
Inspired by that approach, one possibility is:
- Always convert an Int to an Int64 and then write it to the doc.
- When reading an integer value from a doc, return Int32 or Int64, whichever it was encoded as.
Pros:
- Predictable behavior: Int32 and Int64 preserve their types when round-tripped, and Int always becomes Int64.
Cons:
- On a 64-bit platform, any Int value that fits in 32 bits still takes 64 bits on the wire; on a 32-bit platform, the entire Int range fits in 32 bits. Always using 64 bits is not space-efficient in those cases.
- Int is used far more commonly throughout Swift libraries than Int32 and Int64, so users will have to do a lot of casting.
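The MongoKitten-inspired rule can be sketched with a small set of overloads. As before, `WireInteger` and `store` are illustrative names, not the driver's API:

```swift
// Sketch of the proposed rule: Int32 and Int64 keep their types on write,
// and Int always widens to Int64, so round-trip behavior is predictable.
enum WireInteger: Equatable {
    case int32(Int32)
    case int64(Int64)
}

func store(_ value: Int32) -> WireInteger { return .int32(value) }
func store(_ value: Int64) -> WireInteger { return .int64(value) }
func store(_ value: Int)   -> WireInteger { return .int64(Int64(value)) }
```

Because integer literals default to Int in Swift, a plain `store(5)` picks the Int overload and widens to Int64, which is exactly the predictability the proposal is after: the type read back never depends on the value's magnitude.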