What are some ways to improve the performance of the following procedure?
Every day I take a data file, bcp it into an intermediate table with no indexes (so that the import is as fast as possible), and then move the rows into a second table that has many complex indexes guaranteeing data validity. The number of rows inserted each day is around 1 million.
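For context, a minimal sketch of the two-step load described above (all object and file names here are made up for illustration; the bcp flags are the standard Sybase ones):

    -- Step 1 (from the shell): bulk-load the flat file into the
    -- index-free staging table. With no indexes or triggers on the
    -- target, bcp can run in fast (minimally logged) mode:
    --
    --   bcp mydb..staging_daily in daily.dat -c -b 10000 -Uuser -Ppass -Sserver
    --
    -- Step 2: move the rows into the fully indexed production table.
    insert into prod_table
    select * from staging_daily
    go

    -- Step 3: clear the staging table; truncate is minimally logged.
    truncate table staging_daily
    go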
Data retention on the intermediate table does not matter, so I was thinking about using an in-memory table (which is unfortunately only available in 15.5), or disabling the transaction log (is that possible for just one table?), or something else...
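On the logging point: as far as I know, logging cannot be switched off for a single table, but there are database-level options that cut logging for this kind of load. A hedged sketch, assuming a database called mydb (both switches are standard sp_dboption settings; fast bcp only engages when the first one is enabled):

    -- sp_dboption must be run from master.
    use master
    go
    -- Allow minimally logged operations (fast bcp, select into).
    exec sp_dboption 'mydb', 'select into/bulkcopy/pllsort', true
    go
    -- Optionally keep the log trimmed at each checkpoint; only do this
    -- if you do not depend on transaction-log dumps to recover mydb.
    exec sp_dboption 'mydb', 'trunc log on chkpt', true
    go
    use mydb
    go
    checkpoint
    go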
Any clues?
Use a (nonshareable) temporary table?
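A nonshareable temp table (#staging) is visible only to the session that created it, which makes loading it from a separate bcp process awkward; a shareable table created directly in tempdb may be a workable middle ground, since tempdb normally has 'trunc log on chkpt' and 'select into/bulkcopy' enabled by default. A sketch, with hypothetical table and column names:

    -- Shareable temp table: other sessions (including bcp) can see it,
    -- and it inherits tempdb's logging-friendly settings.
    create table tempdb..staging_daily (
        id      int          not null,
        payload varchar(255) null
    )
    go
    -- Load it with bcp as before, then move the rows and drop it.
    insert into mydb..prod_table
    select * from tempdb..staging_daily
    go
    drop table tempdb..staging_daily
    go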