A misconfigured Elasticsearch instance can allow attackers to access it and delete indices, possibly all of them. The community version does not ship with security features, so it is important to configure Elasticsearch in a way that prevents deletion attacks on indices.
The following configuration options are recommended to secure an Elasticsearch instance on a public cloud platform:
- Open the Elasticsearch configuration file:
# vi /etc/elasticsearch/elasticsearch.yml
Change network.host from 0.0.0.0 to 127.0.0.1, as follows:
network.host: 127.0.0.1
This binds the instance to the loopback interface, preventing access from the outside world and thereby blocking external REST API calls, including delete requests.
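As a quick check (assuming the service has been restarted and curl is available), the REST API should now answer only on the loopback address; the same request from any other machine will be refused or time out:

```shell
# Sketch: verify the API is reachable locally after binding to 127.0.0.1.
# The --max-time guard stops the check from hanging if nothing is listening.
curl -s --max-time 5 http://127.0.0.1:9200/ || echo "Elasticsearch is not reachable on loopback"
```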
- Use a custom port instead of 9200:
Change http.port from 9200 to 9909 (or another custom port), as follows:
http.port: 9909
This defeats automated scanner scripts that look for Elasticsearch instances listening on the default port 9200.
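After the port change and a service restart, probes against the default port should come up empty while the API answers on the custom port. A sketch, where 9909 is the illustrative port from above:

```shell
# Sketch: the default port should no longer respond; the custom port should.
curl -s --max-time 5 http://127.0.0.1:9200/ || echo "nothing listening on 9200 (expected after the change)"
curl -s --max-time 5 http://127.0.0.1:9909/ || echo "no response on 9909 - is the service running?"
```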
- Enable the option "Require explicit names when deleting indices" in the Elasticsearch configuration file. This prevents indices from being deleted by wildcard or base/parent name; the exact index name must be supplied with a delete command.
Change action.destructive_requires_name from false to true, as follows:
action.destructive_requires_name: true
Save /etc/elasticsearch/elasticsearch.yml and restart the Elasticsearch service.
# systemctl restart elasticsearch.service
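With these settings in place, a wildcard delete is rejected while an explicitly named delete still goes through. A hedged sketch (the port and the index name are illustrative, and a running local instance is assumed):

```shell
# A wildcard delete like this is refused once destructive actions require
# explicit names; Elasticsearch returns an error instead of deleting anything.
curl -s --max-time 5 -X DELETE 'http://127.0.0.1:9909/_all' || echo "request did not reach the server"

# Deleting a single, explicitly named index (illustrative name) still works:
curl -s --max-time 5 -X DELETE 'http://127.0.0.1:9909/old-logs-2020' || echo "request did not reach the server"
```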
Configuring security in Elasticsearch matters most when the instance is hosted on a public cloud, where attackers can reach it and delete all of its indices. The points above protect against index deletion. In addition, Apache or nginx rules can restrict which kinds of requests the web server lets through: for example, allow GET and POST requests but deny DELETE, PUT, OPTIONS, and so on.
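One way to enforce such a method restriction, assuming nginx is used as a reverse proxy in front of Elasticsearch, is a `limit_except` block; the location and upstream address below are illustrative:

```nginx
# Illustrative fragment for an nginx reverse proxy in front of Elasticsearch.
# limit_except allows the listed methods (GET implicitly allows HEAD) and
# applies the inner rules to everything else, so DELETE, PUT, OPTIONS, etc.
# are denied at the edge before they ever reach Elasticsearch.
location / {
    limit_except GET POST {
        deny all;
    }
    proxy_pass http://127.0.0.1:9909;
}
```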