{
  "_source": {
    "enabled": false
  },
  "analysis": {
    "analyzer": {
      "default": {
        "type": "custom",
        "tokenizer": "uax_url_email",
        "filter": "lowercase,standard,stop"
      }
    }
  },
  "mappings": {
    "table": {
      "properties": {
        "field1": {
          "type": "string",
          "include_in_all": false,
          "index": "no"
        },
        "field2": {
          "type": "long",
          "include_in_all": false,
          "index": "no"
        },
        "field3": {
          "type": "string",
          "index": "analyzed"
        }
      }
    }
  }
}
The analyzer doesn't seem to work when I test it. It should drop English stop words and index an email address as a single token. But when I use "TEST ANALYZER" and type "Jack is fine", all three words get indexed. I do not want it to index English stop words such as "and", "is", etc.
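For reference, I assume the "TEST ANALYZER" button does the equivalent of calling the _analyze API directly (my_index is just a placeholder name; this is the query-string form used by older 1.x-style versions, which matches the "string" mappings above):

curl -XGET 'localhost:9200/my_index/_analyze?analyzer=default&pretty' -d 'Jack is fine'

This shows the same three tokens the UI reports.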
You set the fields to "index": "no" and also disable include_in_all. How do you expect to have something put in the index? The documentation states that a field mapped with "index": "no" is not indexed at all, so it will never be searchable. The analysis configuration also needs to be wrapped in a "settings" object, which is missing from your index definition, as shown below.
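Here is a minimal sketch of the corrected definition, assuming the same "table" type and field names from the question; the filter list is written as a JSON array, and _source is moved under the type mapping, where it belongs:

{
  "settings": {
    "analysis": {
      "analyzer": {
        "default": {
          "type": "custom",
          "tokenizer": "uax_url_email",
          "filter": ["standard", "lowercase", "stop"]
        }
      }
    }
  },
  "mappings": {
    "table": {
      "_source": {
        "enabled": false
      },
      "properties": {
        "field1": {
          "type": "string",
          "include_in_all": false,
          "index": "no"
        },
        "field2": {
          "type": "long",
          "include_in_all": false,
          "index": "no"
        },
        "field3": {
          "type": "string",
          "index": "analyzed"
        }
      }
    }
  }
}

With this in place, analyzing "Jack is fine" with the default analyzer should yield only "jack" and "fine" (the stop filter drops "is"), and the uax_url_email tokenizer keeps an address such as jack@example.com as a single token. Note that field1 and field2 will still not be indexed as long as they are mapped with "index": "no".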