I have a large-ish database which consists of 100+ tables. A few of these tables are so big that running VACUUM on them takes a couple of hours, so I don't want to run it on those tables. Is there any way to run VACUUM with a dynamic list of tables, i.e. something like below?
VACUUM (SELECT table_name FROM information_schema.tables WHERE table_name NOT IN ('list', 'of', 'big', 'tables'))
It would of course be possible to just list all the table names manually, but ideally I would like to have a way to do it dynamically.
This may actually be counter-productive.
You say that the VACUUM takes hours but does it actually do anything? Does it remove any tuples from the table?
If there are a lot of transactional updates made to these tables, then the "dead" rows have to be cleared away at some point (by VACUUM'ing). If you don't, the performance of your application will rapidly go downhill as PostgreSQL has to navigate through ever-increasing numbers of dead rows.
Which version of PostgreSQL are you running? Autovacuum has been enabled by default since 8.3, so you may find it is already doing the VACUUM'ing for you, automatically.
If you still want to do this, then write a query to "select" the VACUUM statements, spool them into a file and run the file as SQL:
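A minimal sketch of that approach, using the `pg_tables` catalog view (the excluded table names are placeholders for your actual big tables):

```sql
-- Generate one VACUUM statement per table, skipping the big ones.
-- quote_ident() protects against table names that need quoting.
SELECT 'VACUUM ' || quote_ident(schemaname) || '.' || quote_ident(tablename) || ';'
FROM pg_tables
WHERE schemaname = 'public'
  AND tablename NOT IN ('big_table_1', 'big_table_2');
```

In psql you can spool the output with `\o vacuum.sql` before running the query, then execute it with `\i vacuum.sql`; on recent psql versions you can skip the file entirely by ending the query with `\gexec`, which runs each generated statement directly.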