Custom Analyzers: Token Filters
Explore the most commonly used built-in token filters.
Overview
Token filters are analysis components that process the tokens produced by a tokenizer. They can add, remove, or alter tokens in the token stream. Elasticsearch offers a wide range of token filters, which can be broadly grouped into three types:
- Normalization filters
- Stemming filters
- Miscellaneous filters
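
To make these categories concrete, here is a minimal sketch of an index definition that chains one filter from each group into a custom analyzer. The index name `my_index` and analyzer name `my_custom_analyzer` are hypothetical; `lowercase` (normalization), `stemmer` (stemming), and `trim` (miscellaneous) are built-in Elasticsearch token filters.

```json
PUT /my_index
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_custom_analyzer": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": [
            "lowercase",
            "stemmer",
            "trim"
          ]
        }
      }
    }
  }
}
```

Filters run in the order listed, so each token emitted by the `standard` tokenizer is lowercased, then stemmed, then stripped of surrounding whitespace. You can preview the result with the `_analyze` API against this analyzer.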