[SERVER-10842] Implement an interface for tokenizer Created: 22/Sep/13 Updated: 25/Jun/15 Resolved: 18/Mar/15 |
|
| Status: | Closed |
| Project: | Core Server |
| Component/s: | Text Search |
| Affects Version/s: | None |
| Fix Version/s: | None |
| Type: | Improvement | Priority: | Minor - P4 |
| Reporter: | ShiLei | Assignee: | Unassigned |
| Resolution: | Duplicate | Votes: | 0 |
| Labels: | None |
| Remaining Estimate: | Not Specified |
| Time Spent: | Not Specified |
| Original Estimate: | Not Specified |
| Issue Links: |
|
| Participants: |
| Description |
|
It seems impossible to implement a universal tokenizer. Why not build a tokenizer interface instead? Users would then be able to integrate MongoDB with any production-ready tokenizer, and could even specify a different tokenizer for each language. |
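A minimal sketch of what such an interface might look like, in C++. All names here (`Tokenizer`, `WhitespaceTokenizer`, `makeTokenizerForLanguage`) are hypothetical illustrations of the proposal, not actual MongoDB server classes:

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <vector>

// Hypothetical abstract interface: any production-ready tokenizer
// could be plugged in by implementing this one virtual method.
class Tokenizer {
public:
    virtual ~Tokenizer() = default;
    // Split the input text into tokens for text-search indexing.
    virtual std::vector<std::string> tokenize(const std::string& text) const = 0;
};

// A trivial whitespace tokenizer as one example implementation.
class WhitespaceTokenizer : public Tokenizer {
public:
    std::vector<std::string> tokenize(const std::string& text) const override {
        std::vector<std::string> tokens;
        std::string current;
        for (char c : text) {
            if (c == ' ' || c == '\t' || c == '\n') {
                if (!current.empty()) {
                    tokens.push_back(current);
                    current.clear();
                }
            } else {
                current.push_back(c);
            }
        }
        if (!current.empty()) tokens.push_back(current);
        return tokens;
    }
};

// Per-language lookup, so each language could get its own tokenizer.
// Only "english" is wired up in this sketch.
std::unique_ptr<Tokenizer> makeTokenizerForLanguage(const std::string& lang) {
    if (lang == "english") return std::make_unique<WhitespaceTokenizer>();
    return nullptr;  // unknown language: caller falls back to a default
}
```

With this shape, the server would tokenize through the interface only, and users could register a language-specific implementation without touching core indexing code.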
| Comments |
| Comment by J Rassi [ 18/Mar/15 ] |
|
Closing as a duplicate of |