- Type: Bug
- Resolution: Declined
- Priority: Major - P3
- None
- Affects Version/s: None
- Component/s: Spark Connector
- Labels:
scala> val df = MongoSpark.load(sparkSession)
<console>:23: error: not found: value MongoSpark
       val df = MongoSpark.load(sparkSession)
I tried following the steps exactly, but I got this error at the command listed above. Why is it not working?
Can you give me an update on this?
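For context, `not found: value MongoSpark` in the REPL usually means either the connector jar was not put on the shell's classpath or the `MongoSpark` object was never imported. A minimal sketch of a working session is below; the connector coordinates, version, and the `mongodb://127.0.0.1/test.coll` URI are assumptions you would replace with values matching your own Spark/Scala build and deployment:

```scala
// Launch spark-shell with the connector on the classpath (version is an
// assumption; choose one matching your Spark and Scala versions):
//
//   spark-shell \
//     --conf "spark.mongodb.input.uri=mongodb://127.0.0.1/test.coll" \
//     --packages org.mongodb.spark:mongo-spark-connector_2.12:2.4.2

// Bring the MongoSpark object into scope before calling load():
import com.mongodb.spark._

// sparkSession is the SparkSession already available in spark-shell
// (bound there as `spark`); load() reads the collection named in
// spark.mongodb.input.uri into a DataFrame.
val df = MongoSpark.load(sparkSession)
df.printSchema()
```

Without the `--packages` flag (or an equivalent `--jars` entry) the import itself fails, and without the import the bare name `MongoSpark` is unresolved, which matches the error reported above.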