I need something like Cloud Storage for Firebase: download metadata of all files, but in Python instead of Angular, and for a single chosen file instead of all files.
The aim is either to return this information when the Cloud Function finishes with its return statement, or to log it while the Cloud Function runs, as soon as the file is saved in the Google Cloud Storage bucket. With that information at hand, another job can be started after the given timestamp; the pipeline is synchronous.
I have found Q&As on loading a file or its data into the Cloud Function in order to extract statistics from the external file:

- using the /tmp directory, as in Reading Data From Cloud Storage Via Cloud Functions, or
- loading the data (not the file) with storage.Client() or pandas df.read_csv(), as in How to load a file from google cloud storage to google cloud function.
Since I do not want to hold the large file or its data in memory at any point just to get some metadata, I want to download only the metadata of the file stored in the Google Cloud Storage bucket, meaning its timestamp and size.
How can I fetch only the metadata of a CSV file in a Google Cloud Storage bucket into the Google Cloud Function?
There is a Google document that shows how to get an object's metadata, which is similar to the GitHub link that I provided in the comment. You can look at the library here.
It fetches just the metadata and does not retrieve the object's data until you call download_to_filename().
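A minimal sketch of that approach with the google-cloud-storage Python client; the bucket and blob names below are placeholders, not names from your setup:

from google.cloud import storage

def get_blob_metadata(bucket_name: str, blob_name: str) -> dict:
    """Fetch only the object's metadata from GCS; no object data is downloaded."""
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    # get_blob() makes a metadata-only request; the file's contents stay in the bucket.
    blob = bucket.get_blob(blob_name)
    if blob is None:
        raise FileNotFoundError(f"gs://{bucket_name}/{blob_name} not found")
    return {
        "size": blob.size,                  # bytes
        "updated": blob.updated,            # datetime of last modification
        "time_created": blob.time_created,  # datetime of creation
    }

# Example call inside the Cloud Function, e.g. to log the metadata:
print(get_blob_metadata("my-bucket", "path/to/file.csv"))

Since get_blob() issues a single metadata request, the CSV's contents never enter the function's memory.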
Alternatively, you can have a look at the Objects: get documentation of the JSON API, which shows that the request returns only the metadata as long as alt=media isn't specified, and try it.
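If you want the raw JSON API call instead of the client library, a sketch under the assumption that application default credentials are available (as they are inside a Cloud Function) could look like this; bucket and object names are again placeholders:

import urllib.parse

import google.auth
import google.auth.transport.requests
import requests

def get_object_metadata(bucket_name: str, object_name: str) -> dict:
    """objects.get without alt=media returns only the object's metadata."""
    credentials, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/devstorage.read_only"]
    )
    credentials.refresh(google.auth.transport.requests.Request())
    url = (
        "https://storage.googleapis.com/storage/v1/b/"
        f"{bucket_name}/o/{urllib.parse.quote(object_name, safe='')}"
    )
    response = requests.get(
        url, headers={"Authorization": f"Bearer {credentials.token}"}
    )
    response.raise_for_status()
    return response.json()  # includes "size", "updated" and "timeCreated"

meta = get_object_metadata("my-bucket", "path/to/file.csv")
print(meta["size"], meta["updated"])

Adding ?alt=media to the same URL would switch the response to the object's data, which is exactly what you want to avoid here.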