[DOCS-15666] [Spark] Description of giving partitioner options via SparkConf is not right Created: 04/Oct/22 Updated: 29/Oct/23 Resolved: 07/Oct/22 |
|
| Status: | Closed |
| Project: | Documentation |
| Component/s: | Spark Connector |
| Affects Version/s: | None |
| Fix Version/s: | None |
| Type: | Typo | Priority: | Major - P3 |
| Reporter: | Mehdi El Hajami | Assignee: | Christopher Cho |
| Resolution: | Fixed | Votes: | 0 |
| Labels: | spark | ||
| Remaining Estimate: | Not Specified | ||
| Time Spent: | Not Specified | ||
| Original Estimate: | Not Specified | ||
| Attachments: | |
| Participants: | |
| Days since reply: | 1 year, 17 weeks, 5 days ago |
| Epic Link: | DOCSP-20538 |
| Story Points: | 1 |
| Description |
|
Hello, in the MongoDB Spark Connector documentation there seems to be a mistake. When passing configuration via SparkConf, we are asked to use the prefix spark.mongodb.read.partitionerOptions instead of partitioner.options. https://www.mongodb.com/docs/spark-connector/current/configuration/read/
For example, the documentation shows partitioner option keys under spark.mongodb.read.partitionerOptions, but looking at the connector code this seems to be wrong: the correct prefix is spark.mongodb.read.partitioner.options. |
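A minimal sketch of the key naming at issue, assuming the prefix reported in this ticket is the one the connector code reads. The partitioner class name and the partition.size option are illustrative values taken from the connector docs; only the partitioner.options. prefix is the point here.

```python
# Spark read-configuration namespace for the MongoDB Spark Connector.
READ_PREFIX = "spark.mongodb.read."


def partitioner_option_key(option: str) -> str:
    """Build the config key the connector actually reads per this ticket:
    spark.mongodb.read.partitioner.options.<option>, NOT the
    spark.mongodb.read.partitionerOptions.<option> form the docs showed."""
    return READ_PREFIX + "partitioner.options." + option


# Illustrative configuration dict, as one might pass to SparkConf.setAll():
conf = {
    # Partitioner class (value is illustrative, from the connector docs):
    READ_PREFIX + "partitioner": "com.mongodb.spark.sql.connector.read.partitioner.PaginateBySizePartitioner",
    # Partitioner option, correctly namespaced:
    partitioner_option_key("partition.size"): "64MB",
}

for key in conf:
    print(key)
```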
| Comments |
| Comment by Christopher Cho [ 07/Oct/22 ] |
|
Thanks for reporting this issue mehdi.elhajami.pro@gmail.com, we've updated the documentation to include the fix! |