Elasticsearch has a plethora of built-in analyzers, as well as the ability to automatically detect the data type for each of your fields. If you want to fine-tune how your fields are analyzed and mapped, then Elasticsearch has you covered. In this hands-on lab, you will create and use custom analyzers and multi-fields, and define both explicit and dynamic mappings in Elasticsearch.
Learning Objectives
Successfully complete this lab by achieving the following learning objectives:
- Create the strings_as_keywords Component Template
From the Kibana console, create the `strings_as_keywords` component template with the following requirements:
  - Dynamically maps `string` fields as the `keyword` data type.
  - Only indexes the first `256` characters.
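A component template meeting these requirements could be sketched in the Kibana console as follows (the dynamic template's inner label, here also `strings_as_keywords`, is an arbitrary name of your choosing):

```json
PUT _component_template/strings_as_keywords
{
  "template": {
    "mappings": {
      "dynamic_templates": [
        {
          "strings_as_keywords": {
            "match_mapping_type": "string",
            "mapping": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        }
      ]
    }
  }
}
```

Here `ignore_above: 256` tells Elasticsearch not to index values longer than 256 characters, which satisfies the second requirement.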
- Create the social_media_analyzer Component Template
From the Kibana console, create the `social_media_analyzer` component template with the following requirements:
  - Creates a custom analyzer called `social_media` that uses the `classic` tokenizer.
  - Uses the `lowercase` filter and a custom `english_stop` filter. The `english_stop` filter should remove English stop words.
  - Uses a custom `emoticons` mapping character filter with the following mappings:
    - `:)` as `happy`
    - `:D` as `laughing`
    - `:(` as `sad`
    - `:')` as `crying`
    - `:O` as `surprised`
    - `;)` as `winking`
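One way to assemble these pieces in the Kibana console is sketched below: the `emoticons` character filter rewrites each emoticon before tokenization, the `classic` tokenizer splits the text, and the `lowercase` and `english_stop` filters run afterward (`"_english_"` is the built-in English stop-word list):

```json
PUT _component_template/social_media_analyzer
{
  "template": {
    "settings": {
      "analysis": {
        "char_filter": {
          "emoticons": {
            "type": "mapping",
            "mappings": [
              ":) => happy",
              ":D => laughing",
              ":( => sad",
              ":') => crying",
              ":O => surprised",
              ";) => winking"
            ]
          }
        },
        "filter": {
          "english_stop": {
            "type": "stop",
            "stopwords": "_english_"
          }
        },
        "analyzer": {
          "social_media": {
            "type": "custom",
            "char_filter": ["emoticons"],
            "tokenizer": "classic",
            "filter": ["lowercase", "english_stop"]
          }
        }
      }
    }
  }
}
```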
- Create the twitter_template Index Template
From the Kibana console, create the `twitter_template` index template with the following requirements:
  - Matches all indices that start with "twitter-".
  - Composed of the `strings_as_keywords` and `social_media_analyzer` component templates.
  - Maps the `tweet` field as an analyzed string field with the `social_media` analyzer.
  - Maps a `tweet.keyword` multi-field as type `keyword` that only indexes the first `280` characters.
  - Sets the number of shards to `1` and replicas to `0`.
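Putting the requirements above together, the index template might look like this sketch in the Kibana console. The `composed_of` array pulls in the two component templates, and the explicit `tweet` mapping (an analyzed `text` field with a `keyword` multi-field) is layered on top:

```json
PUT _index_template/twitter_template
{
  "index_patterns": ["twitter-*"],
  "composed_of": ["strings_as_keywords", "social_media_analyzer"],
  "template": {
    "settings": {
      "number_of_shards": 1,
      "number_of_replicas": 0
    },
    "mappings": {
      "properties": {
        "tweet": {
          "type": "text",
          "analyzer": "social_media",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 280
            }
          }
        }
      }
    }
  }
}
```

Note that settings defined directly in the index template take precedence over anything the composed component templates define, which is why the shard and replica counts are set here.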