I have a single instance of SQL Server 2008 Analysis Services running on Windows Server 2008 x64 with 8GB of RAM on the server. All of the settings for memory are left to their defaults.
When the SSAS service starts, it behaves normally. However, whenever a query is run against the instance, the SSAS service will continually consume memory until it has taken all of the memory on the server, to the point that the server needs to be rebooted. At present, the TotalMemoryLimit and LowMemoryLimit settings are set to 80% and 75% respectively. The HardMemoryLimit is set to 0 by default.
Why is SSAS using all of the memory on the server, despite the limit settings? Does HardMemoryLimit need to be set?
From the TechNet article "SQL Server 2005 Analysis Services (SSAS) Server Properties":
In short, TotalMemoryLimit is not a hard cap; it can be exceeded during heavy loads.
From the SSAS Performance Guide:
Also, most people run SSAS on the same box as SQL Server. Remember to keep 1.5-2 GB for the OS and light services, then add SQL Server's minimum memory setting (1 GB? 2?) plus about 0.5 GB for SQL Server overhead (the memory setting sizes the buffer cache, not all of the memory SQL Server uses). Now set the memory limit percentage based on that: 100 - (100 * (os + sql) / total).
If you have 4 GB used for SQL Server and the OS on an 8 GB box, then set TotalMemoryLimit to 50 (100 - (100 * 4 / 8)). Also be aware of the temp file issue: if SSAS grabbed all the memory on your box, full-cube processing may genuinely need that much working space!
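The arithmetic above can be sketched as follows (the function name is mine, chosen for illustration):

```python
def ssas_total_memory_limit(total_gb, reserved_gb):
    """Rule-of-thumb TotalMemoryLimit percentage: the share of server RAM
    left over after reserving memory for the OS and SQL Server."""
    return 100 - (100 * reserved_gb / total_gb)

# 8 GB server, with ~2 GB for OS/services + ~1.5 GB SQL minimum
# + ~0.5 GB SQL overhead, rounded to 4 GB reserved:
print(ssas_total_memory_limit(8, 4))  # 50.0
```

TotalMemoryLimit values of 100 or less are interpreted by SSAS as a percentage of physical memory, so the result plugs straight into the server property.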
I would check the perfmon counters to confirm exactly what is using the memory; it is not clear whether you are using Task Manager to identify the usage. Analysis Services should peak out at TotalMemoryLimit. Do the queries continue to run? Why do you have to reboot: is it to kill the ongoing queries? I would also check the memory cleaner counters to see whether the issue is there. Ultimately, you may just need to add more memory for the queries you are running, or take a look at your cube designs.