I am developing an API in which I need the paths of multiple files on a remote server. The number of files varies from 100 to 500, and they are spread across different folders.
So I am looping 10-50 times (depending on the number of files) and fetching the paths over SSH from my Python API.
But I want an optimized solution. Right now I open a new SSH connection on every loop iteration, which is slow and not the best thing to do.
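For reference, here is roughly how reusing a single connection could look with paramiko (the host, user, search root `/data`, and the file-name patterns are all placeholders for my setup; key-based auth is assumed):

```python
import paramiko

HOST = "remote.example.com"   # placeholder host
USER = "deploy"               # placeholder user

# Hypothetical file-name patterns I need paths for
patterns = ["report_2024*.csv", "run_a*.log"]

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(HOST, username=USER)  # key-based auth assumed

try:
    # One find invocation for all patterns, instead of one SSH
    # connection per file
    name_args = " -o ".join(f"-name '{p}'" for p in patterns)
    cmd = f"find /data -type f \\( {name_args} \\)"
    stdin, stdout, stderr = client.exec_command(cmd)
    paths = stdout.read().decode().splitlines()
finally:
    client.close()

print(paths)
```

Even this already cuts the cost from N connections to one connection plus one remote command, but I suspect there are still better approaches.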
I was thinking of copying /var/lib/mlocate/mlocate.db from the remote server to my local machine daily and then finding the paths with the locate command against that db, if that is possible. Or, more generally, storing the remote directory index on my local machine in something I can query faster.
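To make that idea concrete, a sketch of what I had in mind, assuming the db is synced daily (e.g. with rsync) to a local path of my choosing; mlocate's `locate` accepts an alternative database via `-d`:

```python
import subprocess

# Copied daily from the remote server's /var/lib/mlocate/mlocate.db
# (placeholder local path)
LOCAL_DB = "/var/cache/remote-index/mlocate.db"

def remote_locate(pattern):
    """Query the copied mlocate database instead of opening an SSH session."""
    result = subprocess.run(
        ["locate", "-d", LOCAL_DB, pattern],
        capture_output=True, text=True,
    )
    return result.stdout.splitlines()

paths = remote_locate("report_2024*.csv")  # hypothetical pattern
```

My concern with this is staleness: the index would only be as fresh as the last daily sync, so files created since then would be missed.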
What other, better ways are there to achieve this?