[SERVER-59683] Make LRU cache utility more generic so that it can limit its size by bytes Created: 31/Aug/21 Updated: 29/Oct/23 Resolved: 17/Sep/21 |
|
| Status: | Closed |
| Project: | Core Server |
| Component/s: | None |
| Affects Version/s: | None |
| Fix Version/s: | 5.1.0-rc0 |
| Type: | Task | Priority: | Major - P3 |
| Reporter: | Anton Korshunov | Assignee: | Alexander Ignatyev |
| Resolution: | Fixed | Votes: | 0 |
| Labels: | None | ||
| Remaining Estimate: | Not Specified | ||
| Time Spent: | Not Specified | ||
| Original Estimate: | Not Specified | ||
| Backwards Compatibility: | Fully Compatible |
| Sprint: | QO 2021-09-20 |
| Participants: |
| Description |
|
The base PlanCache implementation uses the LRUKeyValue store for caching plans. This class implements a least recently used (LRU) eviction policy based on the number of entries in the cache, which is fixed at construction. This is fine for the classic PlanCache, but in SBE the eviction policy should be based on memory consumption rather than on the number of entries. That is, we should be able to track the total amount of memory occupied by the plan cache entries and start evicting when it reaches a pre-defined limit set via a server configuration parameter. Given that we're looking to reuse this LRU store for both the classic and SBE engines, we need to abstract the logic that deals with cache size into a separate interface with two implementations - one that computes the size based on the number of cache entries, and another based on the total size of the entries - and refactor LRUKeyValue to support this new interface. A sketch of this separation is given below. |
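The following is a minimal illustrative sketch of that separation, not the actual code from the patch: the type names (EntryCountEstimator, MemorySizeEstimator, BudgetTracker) and the estimateObjectSizeInBytes() accessor are hypothetical. The idea is that LRUKeyValue would be parameterized on an estimator, charge each inserted entry against a budget tracker, and evict least-recently-used entries while the tracker reports the cache as over budget.

```cpp
#include <cstddef>

// Computes the "cost" that a single cache entry contributes toward the cache budget.
// Counting policy: every entry costs 1 unit, so the budget limits the number of entries
// (classic PlanCache behavior).
template <typename V>
struct EntryCountEstimator {
    size_t operator()(const V&) const {
        return 1;
    }
};

// Memory policy: every entry costs its in-memory footprint, so the budget limits the
// total number of bytes (intended SBE behavior). Assumes the value type exposes a
// hypothetical estimateObjectSizeInBytes() accessor.
template <typename V>
struct MemorySizeEstimator {
    size_t operator()(const V& value) const {
        return value.estimateObjectSizeInBytes();
    }
};

// Tracks the total budget consumed by the cache and reports when eviction is needed.
// The maximum budget would come from a server configuration parameter.
class BudgetTracker {
public:
    explicit BudgetTracker(size_t maxBudget) : _max(maxBudget) {}

    void onAdd(size_t cost) {
        _current += cost;
    }
    void onRemove(size_t cost) {
        _current -= cost;
    }
    bool isOverBudget() const {
        return _current > _max;
    }

private:
    size_t _max;
    size_t _current = 0;
};
```

With this shape, the same LRUKeyValue template can serve both engines: instantiated with EntryCountEstimator it behaves like the existing entry-count-limited cache, while instantiating it with MemorySizeEstimator turns the same eviction loop into a byte-budgeted cache. |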
| Comments |
| Comment by Vivian Ge (Inactive) [ 06/Oct/21 ] |
|
Updating the fix version since branching activities occurred yesterday. This ticket will be included in rc0 when it is triggered. For up-to-date release information, please keep an eye on #server-release. Thank you! |
| Comment by Githook User [ 17/Sep/21 ] |
|
Author: Alexander Ignatyev <alexander.ignatyev@mongodb.com> (aligusnet)
Message: |