We have eight REST-ish API servers, each running Nginx with PHP-FPM over FastCGI to handle requests. We currently use Nginx's FastCGI caching (directives like fastcgi_cache_path). This means API responses are cached, but each server maintains its own separate cache.
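For context, here is a minimal sketch of the kind of per-server caching we have now (the zone name, paths, and socket are illustrative, not our actual config):

```nginx
# Illustrative per-server FastCGI cache: every one of the eight
# servers keeps its own copy of this cache on local disk.
fastcgi_cache_path /var/cache/nginx/fastcgi levels=1:2
                   keys_zone=api_cache:10m max_size=1g inactive=60m;

server {
    listen 80;

    location / {
        fastcgi_pass unix:/run/php/php-fpm.sock;
        include fastcgi_params;

        fastcgi_cache api_cache;
        fastcgi_cache_key "$scheme$request_method$host$request_uri";
        fastcgi_cache_valid 200 10m;
    }
}
```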
Is there a good way to share cache storage among all eight servers?
We have considered using Redis as shared storage, but the available modules seem to require application changes. In some cases, we may also want to cache responses to requests outside our control (HTTP requests to external APIs). Ideally, there would be a drop-in replacement for Nginx's built-in caching of FastCGI and HTTP responses.
There is a fairly recent blog post at https://www.nginx.com/blog/shared-caches-nginx-plus-cache-clusters-part-1/ about this issue. The first example could be useful if you run more than two nginx cache servers.
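The first approach in that post shards the cache: a load-balancer tier hashes each request URI to exactly one cache server, so each response is stored once across the cluster instead of once per server. A minimal sketch using only open source directives (hostnames are illustrative):

```nginx
# Load-balancer tier: consistent hashing sends a given URI to the
# same cache server every time, so the cluster's effective cache
# size is the sum of the servers' caches, not N duplicate copies.
upstream cache_servers {
    hash $scheme$proxy_host$request_uri consistent;
    server cache1.example.com;
    server cache2.example.com;
}

server {
    listen 80;

    location / {
        proxy_pass http://cache_servers;
    }
}
```

The trade-off is that this splits the cache rather than replicating it: if one cache server goes down, its shard of cached content is lost until repopulated.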
The second part of the post sounds especially interesting for my current use case, where I want changed items to be re-crawled into my caches automatically.
It should work with the open source version of Nginx too. Basically, it works by proxying requests through a cascade of Nginx servers (Nginx-Cache1 -> Nginx-Cache2 -> Origin-Server); each server caches from its upstream, and if desired you can build an HA cluster this way as well. https://www.nginx.com/blog/shared-caches-nginx-plus-cache-clusters-part-2/
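The cascade described above can be sketched like this (hostnames, paths, and zone names are illustrative): the first tier serves from its local cache and, on a miss, forwards to the second tier, which does the same before finally hitting the origin.

```nginx
# On Nginx-Cache1: cache locally, fall through to Nginx-Cache2 on a miss.
proxy_cache_path /var/cache/nginx levels=1:2
                 keys_zone=tier1:10m max_size=1g inactive=60m;

server {
    listen 80;

    location / {
        proxy_cache tier1;
        proxy_cache_valid 200 10m;
        proxy_pass http://nginx-cache2.example.com;
    }
}

# On Nginx-Cache2 the config is the same pattern, with its own
# proxy_cache_path/keys_zone and proxy_pass pointing at the origin:
#     proxy_pass http://origin.example.com;
```

Because each tier caches independently, a response warmed on the second tier survives a cache wipe on the first, which is what makes the HA variant in part 2 of the post possible.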