- Type: Bug
- Resolution: Done
- Priority: Major - P3
- Affects Version/s: 2.0.0-rc1
- Component/s: Configuration
- Environment: Windows 10, Spark 2.0.0, Scala 2.11.2
 
 
I have a few collections that I need to read in. I've tried following the documentation on how to set ReadConfig, but I'm still coming up blank.
I've attached a Scala object in which I try connecting to two different collections. MongoSpark ignores my ReadConfig for the second read and continues to read from the first collection (a sketch of the pattern is included below the list).
I've spent about four hours trying to figure this out and have tried many permutations, all of which yield errors.
The attached run was produced with:
- Mongo settings NOT specified in spark-defaults.conf
- No other --conf options passed to my Spark master/driver
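
For reference, a minimal sketch of the pattern I'm attempting (the URI, database, and collection names here are placeholders, not my actual setup; my real code is in the attached object):

```scala
import org.apache.spark.sql.SparkSession
import com.mongodb.spark.MongoSpark
import com.mongodb.spark.config.ReadConfig

object TwoCollectionRead {
  def main(args: Array[String]): Unit = {
    // Placeholder connection URI, database, and collection names.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("TwoCollectionRead")
      .config("spark.mongodb.input.uri", "mongodb://localhost:27017/mydb.coll1")
      .getOrCreate()

    // First read: uses the default spark.mongodb.input.* settings (mydb.coll1).
    val first = MongoSpark.load(spark)

    // Second read: override only the collection, inheriting the remaining
    // defaults from the SparkContext configuration.
    val secondConfig = ReadConfig(
      Map("collection" -> "coll2"),
      Some(ReadConfig(spark.sparkContext)))
    val second = MongoSpark.load(spark, secondConfig)

    // Expectation: two different counts; observed: both reads hit coll1.
    println(s"coll1: ${first.count()}, coll2: ${second.count()}")
    spark.stop()
  }
}
```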