I have a 64-bit Windows Server 2003 machine with 48 CPU cores and 128 GB of RAM running a single application: SQL Server 2008 Analysis Services (SSAS). SSAS is currently using about 50 GB of memory.
SSAS relies heavily on the Windows file cache to hold frequently used data (see this article, for example). On my server, the file cache normally sits in the 10-15 GB range, but occasionally it drops suddenly to 5-6 GB (as measured by the Memory\System Cache Resident Bytes performance counter). When that happens, all SSAS page reads have to go to disk, and queries start timing out until the file cache repopulates.
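For reference, I'm watching this in perfmon, but the programmatic equivalent via the PDH API looks roughly like the sketch below. The counter path is the only part specific to this problem; the rest is standard PDH boilerplate.

    // Minimal sketch: poll \Memory\System Cache Resident Bytes once per
    // second via the PDH API and print it. Link against pdh.lib.
    #include <windows.h>
    #include <pdh.h>
    #include <stdio.h>

    #pragma comment(lib, "pdh.lib")

    int main(void)
    {
        PDH_HQUERY query;
        PDH_HCOUNTER counter;
        PDH_FMT_COUNTERVALUE value;

        if (PdhOpenQuery(NULL, 0, &query) != ERROR_SUCCESS)
            return 1;
        if (PdhAddCounter(query, TEXT("\\Memory\\System Cache Resident Bytes"),
                          0, &counter) != ERROR_SUCCESS)
            return 1;

        for (;;) {
            // This counter is an instantaneous value, so a single
            // collection per sample is enough (no baseline needed).
            PdhCollectQueryData(query);
            if (PdhGetFormattedCounterValue(counter, PDH_FMT_LARGE,
                                            NULL, &value) == ERROR_SUCCESS)
                printf("System Cache Resident Bytes: %lld\n",
                       (long long)value.largeValue);
            Sleep(1000);
        }
    }

Logging this alongside a timestamp is how I've been correlating the cache drops with the query timeouts.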
I have a second server (only 24 cores, but otherwise nearly identical) that doesn't exhibit the same symptoms, even though it runs the same SSAS databases, the same queries, and the same load (this is a load-balanced environment).
I've asked a detailed SSAS-focused question on dba.stackexchange.com, but I have a few questions about the Windows side of the behavior:
Is there a way to know why the SSAS database file is getting flushed out of the cache?
Can I pre-populate the file cache or actively manage it in some way? (I'm not sure that using a RAM drive is an option in our environment.) See the sketch below for what I mean by pre-populating.
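What I'm imagining for "pre-populate" is just streaming the data files through buffered reads so the cache manager maps their pages. A sketch, using a hypothetical file path (in practice I'd walk the SSAS data folder):

    // Sketch: warm the Windows file cache by reading a data file through
    // buffered I/O. The path below is hypothetical.
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        // Deliberately NOT passing FILE_FLAG_SEQUENTIAL_SCAN: my
        // understanding is that the cache manager may recycle pages
        // behind a sequential scan, which would defeat the warming.
        HANDLE h = CreateFile(TEXT("D:\\OLAP\\Data\\somefile.data"),
                              GENERIC_READ, FILE_SHARE_READ, NULL,
                              OPEN_EXISTING, 0, NULL);
        if (h == INVALID_HANDLE_VALUE) {
            fprintf(stderr, "CreateFile failed: %lu\n", GetLastError());
            return 1;
        }

        static char buf[1 << 20];   // 1 MB reads
        DWORD read;
        long long total = 0;

        // Plain buffered reads pull the file's pages into the system cache.
        while (ReadFile(h, buf, sizeof(buf), &read, NULL) && read > 0)
            total += read;

        printf("Touched %lld bytes\n", total);
        CloseHandle(h);
        return 0;
    }

On the "actively manage" side, I've also been looking at SetSystemFileCacheSize, which can set minimum/maximum working-set sizes for the system cache (it requires SeIncreaseQuotaPrivilege), but I haven't tried it yet and haven't confirmed how well it behaves on our OS level.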
At one point we changed the server's network optimization setting to "Maximize data throughput for file sharing" to match the other box, but that doesn't seem to have made a significant difference.
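For what it's worth, my understanding (an assumption on my part, not something I've verified against documentation) is that this GUI setting maps to the LargeSystemCache and LanmanServer Size registry values, so this is roughly how I've been double-checking that the change actually stuck on both boxes:

    // Sketch: read the registry values that (as I understand it) back the
    // "Maximize data throughput for file sharing" setting.
    #include <windows.h>
    #include <stdio.h>

    static void show_dword(HKEY root, const char *path, const char *name)
    {
        HKEY key;
        DWORD value = 0, size = sizeof(value), type;

        if (RegOpenKeyExA(root, path, 0, KEY_READ, &key) != ERROR_SUCCESS) {
            printf("%s: <cannot open>\n", path);
            return;
        }
        if (RegQueryValueExA(key, name, NULL, &type, (LPBYTE)&value, &size)
                == ERROR_SUCCESS && type == REG_DWORD)
            printf("%s\\%s = %lu\n", path, name, value);
        else
            printf("%s\\%s = <not set>\n", path, name);
        RegCloseKey(key);
    }

    int main(void)
    {
        // 1 is supposed to favor the system cache over process working sets.
        show_dword(HKEY_LOCAL_MACHINE,
            "SYSTEM\\CurrentControlSet\\Control\\Session Manager\\Memory Management",
            "LargeSystemCache");
        // 3 is supposed to mean "maximize throughput for file sharing".
        show_dword(HKEY_LOCAL_MACHINE,
            "SYSTEM\\CurrentControlSet\\Services\\LanmanServer\\Parameters",
            "Size");
        return 0;
    }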
Edit: I've added a bounty. If we can't answer the "why", then perhaps there's a way to better understand which processes are currently using the cache, or which files are in it, or anything else that might point us in the right direction.