[DOCS-12259] [Spark] Explain how to write your own Spark Partitioner Created: 22/Aug/18 Updated: 25/Aug/23 Resolved: 25/May/21 |
|
| Status: | Closed |
| Project: | Documentation |
| Component/s: | Spark Connector |
| Affects Version/s: | None |
| Fix Version/s: | None |
| Type: | Improvement | Priority: | Major - P3 |
| Reporter: | Bryan Reinero | Assignee: | Unassigned |
| Resolution: | Declined | Votes: | 0 |
| Labels: | None | ||
| Remaining Estimate: | Not Specified | ||
| Time Spent: | Not Specified | ||
| Original Estimate: | Not Specified | ||
| Issue Links: |
|
| Participants: | |
| Days since reply: | 2 years, 37 weeks, 1 day ago |
| Epic Link: | DOCSP-6205 |
| Description |
|
Customers such as Spireon have written their own custom partitioners for the Spark Connector because they found the out-of-the-box partitioners too slow for their 30 TB collection. The ability to create custom partitioners is a useful feature of the Spark Connector that we should document for our users. |
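A minimal sketch of what such a custom partitioner could look like, assuming the 2.x connector's MongoPartitioner trait and MongoPartition case class in com.mongodb.spark.rdd.partitioner (verify the exact signatures against the connector version in use). The class name StaticRangePartitioner, the createdDate field, and the split points are illustrative only, not part of the connector.

```scala
import org.bson.{ BsonDocument, BsonString }

import com.mongodb.spark.MongoConnector
import com.mongodb.spark.config.ReadConfig
import com.mongodb.spark.rdd.partitioner.{ MongoPartition, MongoPartitioner }

// Illustrative custom partitioner: splits the collection into fixed ranges over a
// known field instead of sampling or scanning it, the kind of shortcut a user with
// prior knowledge of a very large (e.g. 30 TB) collection might take.
class StaticRangePartitioner extends MongoPartitioner {

  // Hypothetical split points; a real implementation would derive these from
  // metadata the user already has about the collection.
  private val splitPoints: Seq[String] = Seq("2017-01-01", "2018-01-01", "2019-01-01")

  override def partitions(
      connector: MongoConnector,
      readConfig: ReadConfig,
      pipeline: Array[BsonDocument]): Array[MongoPartition] = {

    val opts: Seq[Option[String]] = splitPoints.map(Option(_))
    // Pair each lower bound with the next split point: (None, p0), (p0, p1), ..., (pn, None)
    val ranges = (None +: opts).zip(opts :+ None)

    ranges.zipWithIndex.map { case ((lower, upper), index) =>
      val bounds = new BsonDocument()
      lower.foreach(v => bounds.append("$gte", new BsonString(v)))
      upper.foreach(v => bounds.append("$lt", new BsonString(v)))
      // queryBounds is the filter the connector applies when reading this partition.
      MongoPartition(index, new BsonDocument("createdDate", bounds), Nil)
    }.toArray
  }
}
```

Assuming the 2.x connector, the class would then be selected via the partitioner read option (for example, setting spark.mongodb.input.partitioner to the fully qualified class name); the exact option name and instantiation requirements should be confirmed against the connector documentation.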
| Comments |
| Comment by Anthony Sansone (Inactive) [ 25/May/21 ] |
|
This ticket has been closed due to age and inactivity. Please file a new ticket with recent details if needed. Thank you. |
| Comment by Ross Lawley [ 23/Oct/18 ] |
|
Document the limitations of the current partitioners and the multiple default key types: |