I've got an Elasticsearch instance with an Nginx reverse proxy sitting in front of it, implementing a URL-level access control mechanism, approximately as described in this article.
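For context, the kind of URL-level control I mean looks roughly like this (a minimal sketch only — the index name, credentials file and port are placeholders, not my actual config):

```nginx
# Only let authenticated users of this group reach their index.
location ~* ^/group-a-index/ {
    auth_basic           "Restricted";
    auth_basic_user_file /etc/nginx/htpasswd-group-a;
    proxy_pass           http://localhost:9200;
}

# Deny anything not explicitly matched above.
location / {
    return 403;
}
```

This works per index path, which is why it gives me per-index granularity but nothing finer.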
This is giving me per-index granularity, which is handy.
What I'm trying to figure out next is: can I enforce a blanket filter that excludes unauthorised results from both searches and direct document retrievals?
I'm thinking a 'group' model, where docs in my indices are tagged by group, and users can only see results and docs for their group.
Now, I know I could do a per-group index, and apply URL level controls. That may be my workaround if I can't do this.
I have had a look at aliasing - this seems to do 90% of what I want, in that I can restrict a search to an alias. But what I can't then do is block a direct GET request for an (unauthorised) document ID.
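To illustrate the gap, here's roughly what I tried (index, alias and field names are hypothetical):

```shell
# Create a filtered alias so searches through "group-a" only
# ever match documents tagged with that group.
curl -XPOST 'http://localhost:9200/_aliases' -d '{
  "actions": [
    { "add": {
        "index": "documents",
        "alias": "group-a",
        "filter": { "term": { "group": "a" } }
    }}
  ]
}'

# Searching via the alias is filtered as intended:
curl 'http://localhost:9200/group-a/_search?q=test'

# ...but a direct GET by ID against the underlying index
# does not apply the alias filter, so any known document ID
# can still be fetched:
curl 'http://localhost:9200/documents/doc/1'
```

As far as I can tell, the get-by-ID API simply doesn't run the alias filter, which is exactly the hole I'm trying to close.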
Is there a way of doing this, or am I just on a road to nowhere?
Note - part of my reason for this is that I'm trying to use a fairly standard Kibana setup, and I have overlapping groups of users.
While I can't directly answer your question (+1), I wanted to point out that the people working at Elastic finally listened to all the requests made by people demanding access control for Elasticsearch, and introduced Shield. Quoting the website:

Maybe it would be worth at least having a look to check whether this fulfills your requirements.
While starting a new project with Elasticsearch at work today, I did some research and found Search Guard - Elasticsearch security for free. Obviously I can't judge (yet) how well this works, but I wanted to leave a pointer here in case you or others are still searching for (a) solution(s), and in case Shield can't be used, for whatever reason.
Quoting the website, these are the features:
But, there are limitations as well:
Maybe this is helpful and of value for you, or for someone else who comes across this post.