Description
These are my index settings and mappings:
{
  "settings": {
    "index": {
      "analysis": {
        "analyzer": {
          "custom_analyzer": {
            "type": "custom",
            "tokenizer": "ik_max_word",
            "filter": ["synonym_filter"]
          }
        },
        "filter": {
          "synonym_filter": {
            "type": "synonym",
            "synonyms_path": "analysis/synonym.dic"
          }
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "question": {"type": "text", "analyzer": "custom_analyzer", "search_analyzer": "custom_analyzer"},
      "answer": {"type": "text"},
      "file_id": {"type": "text"}
    }
  }
}
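For reference, the index is created roughly like this (a minimal sketch assuming the elasticsearch-py client; the host URL and the index name "qa_index" are illustrative placeholders, not my actual values):

from elasticsearch import Elasticsearch

# Host URL and index name are illustrative placeholders.
es = Elasticsearch("http://localhost:9200")
index_name = "qa_index"

index_body = {
    "settings": {
        "index": {
            "analysis": {
                "analyzer": {
                    "custom_analyzer": {
                        "type": "custom",
                        "tokenizer": "ik_max_word",
                        "filter": ["synonym_filter"],
                    }
                },
                "filter": {
                    "synonym_filter": {
                        "type": "synonym",
                        "synonyms_path": "analysis/synonym.dic",
                    }
                },
            }
        }
    },
    "mappings": {
        "properties": {
            "question": {
                "type": "text",
                "analyzer": "custom_analyzer",
                "search_analyzer": "custom_analyzer",
            },
            "answer": {"type": "text"},
            "file_id": {"type": "text"},
        }
    },
}

# The custom analyzer only exists on an index created with these settings.
if not es.indices.exists(index=index_name):
    es.indices.create(index=index_name, body=index_body)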
Then I use the "custom_analyzer" analyzer in a tokenization function:
def tokenization(self, question):
    # Analyze the question text with the index's custom analyzer.
    body = {"text": question, "analyzer": "custom_analyzer"}
    tokens = self.es.indices.analyze(index=self.index_name, body=body)
    return [token["token"] for token in tokens["tokens"]]
I get this error:
ERROR - Request error: BadRequestError(400, 'illegal_argument_exception', 'failed to find analyzer [custom_analyzer]')
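To check whether the analyzer is actually registered on the index, its settings can be inspected like this (a sketch, assuming the same self.es client and self.index_name as in the function above):

# Fetch the index settings and look for the custom analyzer.
settings = self.es.indices.get_settings(index=self.index_name)
analysis = settings[self.index_name]["settings"]["index"].get("analysis", {})
print(analysis.get("analyzer", {}))  # should contain "custom_analyzer" if the settings were applied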
How can I solve this problem?