[DOCS-10878] Spark Connector 2.2 Java Guide shows prerequisite of Spark 2.1.x, but should be 2.2.x

| Created: | 09/Oct/17 | Updated: | 27/Oct/23 | Resolved: | 02/Jan/19 |
| Status: | Closed |
| Project: | Documentation |
| Component/s: | Spark Connector |
| Affects Version/s: | None |
| Fix Version/s: | None |
| Type: | Bug | Priority: | Major - P3 |
| Reporter: | Roger McCoy (Inactive) | Assignee: | Jonathan DeStefano |
| Resolution: | Gone away | Votes: | 0 |
| Labels: | None |
| Remaining Estimate: | Not Specified |
| Time Spent: | Not Specified |
| Original Estimate: | Not Specified |
| Description |
In the Spark Connector 2.2 Java Guide, the prerequisite Spark version is listed as 2.1.x; it should be 2.2.x. My understanding is that each MongoDB Spark Connector version should only be used with the corresponding Spark version, as listed in the connector's compatibility documentation.
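For context, the version-matching requirement simply means the connector's minor version should line up with the Spark minor version. Below is a minimal sketch of the Java Guide's read pattern against Spark 2.2.x; the connection URI, database, and collection names are placeholder assumptions, not values from this ticket.

```java
// Assumes the matching artifact is on the classpath, e.g.
// org.mongodb.spark:mongo-spark-connector_2.11:2.2.x alongside Spark 2.2.x.
import com.mongodb.spark.MongoSpark;
import com.mongodb.spark.rdd.api.java.JavaMongoRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;
import org.bson.Document;

public final class ReadFromMongo {
    public static void main(String[] args) {
        // Spark 2.2.x session; the input URI below is a local placeholder.
        SparkSession spark = SparkSession.builder()
                .master("local")
                .appName("MongoSparkConnectorIntro")
                .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.myCollection")
                .getOrCreate();

        JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());

        // Load the collection as an RDD of BSON Documents and print its size.
        JavaMongoRDD<Document> rdd = MongoSpark.load(jsc);
        System.out.println(rdd.count());

        jsc.close();
    }
}
```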
| Comments |
| Comment by Jonathan DeStefano [ 02/Jan/19 ] |
This was fixed when a subsequent Spark Connector version was released.