The current implementation of index filters allows for granular control, since filters are set and matched against the full query shape:
Index filters determine which indexes the optimizer evaluates for a query shape. A query shape consists of a combination of query, sort, and projection specifications. If an index filter exists for a given query shape, the optimizer only considers those indexes specified in the filter.
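As a concrete illustration, a filter can be set with the `planCacheSetFilter` command; the shape it is keyed on includes the query, sort, and projection documents (the collection name and index choices below are illustrative):

```javascript
// Restrict the planner to the named index for this exact query shape.
// The shape is (query, sort, projection) -- all three must match at query time.
db.runCommand({
    planCacheSetFilter: "orders",
    query: { status: "A" },
    sort: { ts: 1 },
    projection: { customerId: 1, _id: 0 },
    indexes: [ { status: 1, ts: 1 } ]
});
```

A later query with the same predicates and sort but a different projection has a different shape and therefore does not match this filter.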
This gives quite a bit of flexibility. However, the matching pattern does not work well for all use cases. Consider an application that allows customers to dynamically select projections: if index filters are needed in that environment, it may not be feasible to determine and manage a filter for every possible projection.
While the default matching level should remain the same, we could consider introducing a tunable parameter for relaxing it. Using the example above, the option could be set to a mode that instructs the optimizer to consider only the query predicates and sort fields (i.e., to ignore projections) when checking for matches against index filters.
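The proposed behavior can be sketched abstractly. This is a hypothetical model, not actual server code: the mode names and the tuple representation of a query shape are invented for illustration only.

```python
# Hypothetical matching modes -- these parameter names do not exist in the server.
FULL = "full"                           # current behavior: query + sort + projection
PREDICATES_AND_SORT = "predicatesAndSort"  # proposed relaxation: ignore projection

def filter_key(shape, mode=FULL):
    """Reduce a query shape (query, sort, projection) to the key used
    for index-filter lookup under the given matching mode."""
    query, sort, projection = shape
    if mode == PREDICATES_AND_SORT:
        projection = None  # projection is not considered in the relaxed mode
    return (query, sort, projection)

# An index filter registered against a shape that has no projection:
filters = {filter_key((("status",), ("ts",), None)): ["status_1_ts_1"]}

# A query that adds a projection misses the filter under full matching...
assert filter_key((("status",), ("ts",), ("name",))) not in filters
# ...but matches it under the proposed relaxed mode.
assert filter_key((("status",), ("ts",), ("name",)), PREDICATES_AND_SORT) in filters
```

The key point the sketch captures is that the relaxation changes only the lookup key, not how filters are stored, so existing filters set without a projection would start matching projection-varying queries.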