Comments on "Richard Lees on BI: Why doesn't SSAS cache the entire cube?"

Anonymous - 2015-03-12 17:13
Thanks Acius,
That adds more information to the issue. I will look into it when I get a few free minutes. However, I'm not confident that it will be a good workaround: it appears to fix the problem of the file cache taking too much memory, whereas my experience has been plenty of free RAM that SSAS and the file cache fail to use. Ideally, SSAS would have a good cache (like SQL Server's) that holds cells and aggregations in a size-limited cache. SSAS also relies on the Windows file cache to hold data in memory, so there is an overlap, which adds inefficiency. I wouldn't mind so much if the Windows file cache stored all the cube data files, so that every SSAS file call was served from the Windows cache, but that isn't my experience.
Perhaps the SetSystemFileCacheSize API could be used to set a high limit on the file cache.

Thank you for the contribution. It does get us closer to a solution.
Acius - 2015-03-12 16:00
Hi Richard,
We raised a support request for this one a few years ago and got nowhere, after spending a lot of effort proving what you illustrate here.
We ended up taking the workaround on the Connect entry: using solid state and overdoing memory.
Unfortunately the overdoing isn't overdone any more, and the machine can't take additional memory.

Trying to go lateral:
http://support.microsoft.com/kb/976618

Use the GetSystemFileCacheSize and SetSystemFileCacheSize API functions to set the maximum or minimum size for the working set of the system file cache.

This seems to work in a development environment.

Any thoughts?

W
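[Editor's note] For anyone wanting to experiment with the approach in KB 976618, here is a minimal sketch of calling SetSystemFileCacheSize from Python via ctypes. It is an illustration only: the helper names are mine, the call is Windows-only, the process needs SeIncreaseQuotaPrivilege enabled, and the flag semantics should be checked against the Windows API documentation before use.

```python
import ctypes
import sys

def cache_ceiling_bytes(total_ram_bytes, fraction):
    """Pure arithmetic: a proposed file-cache ceiling as a byte count."""
    return int(total_ram_bytes * fraction)

def set_file_cache_max(max_bytes):
    """Windows-only sketch: cap the system file cache working set.

    Wraps kernel32!SetSystemFileCacheSize. The calling process must
    hold (and have enabled) SeIncreaseQuotaPrivilege, or the call fails.
    """
    if sys.platform != "win32":
        raise OSError("SetSystemFileCacheSize is a Windows API")
    FILE_CACHE_MAX_HARD_ENABLE = 0x00000001  # enforce the maximum as a hard limit
    ok = ctypes.windll.kernel32.SetSystemFileCacheSize(
        ctypes.c_size_t(0),          # minimum: leave to Windows
        ctypes.c_size_t(max_bytes),  # maximum working-set size of the cache
        FILE_CACHE_MAX_HARD_ENABLE,
    )
    if not ok:
        raise ctypes.WinError()

# Example: allow the cache up to 90% of a 36 GB server.
limit = cache_ceiling_bytes(36 * 1024**3, 0.90)
if sys.platform == "win32":
    set_file_cache_max(limit)
```

The sketch sets only a maximum and leaves the minimum to Windows; GetSystemFileCacheSize can be wrapped the same way to read the current limits back.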
Bill - 2014-04-10 22:14
Hi Richard,

Recently, I stumbled across a section explaining the economic memory management model (in "Microsoft SQL Server 2008 Analysis Services Unleashed") and immediately thought of this post ;-)

According to the text, regardless of memory utilization (relative to the defined low/total/hard limits), cleaner threads still activate on a recurring basis, traverse the list of "memory holders" and send out requests to give up some shrinkable memory. Upon receiving a request, a "memory holder" calculates the "tax" it owes and pays up (read: releases shrinkable memory). I think this may be partly responsible for the churn you're seeing.

So unless you disable the cleaner threads entirely (which I'm not sure is possible), I don't think you can cache an entire cube - and that assumes you can also eliminate the arbitrary-shaped-sets issue that Chris wrote about some time ago.

Perhaps you might try experimenting with the "LimitSystemFileCachePeriod" setting in the msmdsrv.ini file. The default value is 1000, which I can only assume is in milliseconds - but really I have no idea and haven't ever changed this property.
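[Editor's note] For orientation, msmdsrv.ini is an XML file despite its extension. The fragment below only illustrates where the property mentioned above might sit - the element placement and the companion LimitSystemFileCacheMB property are assumptions on my part, so verify the names and units against your own msmdsrv.ini (and back the file up) before changing anything.

```xml
<!-- Illustrative fragment of msmdsrv.ini; placement assumed, not verified -->
<ConfigurationSettings>
  <Memory>
    <!-- 0 = no limit imposed on the Windows system file cache, in MB -->
    <LimitSystemFileCacheMB>0</LimitSystemFileCacheMB>
    <!-- How often the limit is re-applied; default 1000 (presumably ms) -->
    <LimitSystemFileCachePeriod>1000</LimitSystemFileCachePeriod>
  </Memory>
</ConfigurationSettings>
```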
Richard Lees - 2014-04-01 18:14
Hi Scott,
This is absolutely a feature of SSAS. Many very capable SSAS technicians have asked me similar questions, incredulous that memory caching in SSAS could be so poor in an x64 environment. I have spent the time to convince these technicians of my findings, and on occasion had them reproduce it.

So I apologise for this brief answer; suffice it to say that this is the behaviour of SSAS, and I would very much like to see it improved.
Scott R. - 2014-04-01 16:40
Richard,

Continued from prior post ...

- Can you confirm that no other SSAS memory configuration settings are being explicitly used besides the three named above? It appears that there are other memory "knobs" to play with, but the docs say not to touch them without specific guidance from Microsoft. I am guessing that you haven't set any other memory configuration values, but I wanted to ask specifically and close that loop.

- Can you confirm that you are using the x64 64-bit versions of Windows Server and SSAS on supported server hardware / VMs? This is probably a dumb question, as you spoke of 64-bit in general terms in your post, but I'm asking to close this loop, as you never specifically stated the hardware / software platform for each of these components. Hopefully easy to confirm, put behind, and eliminate as a possible root cause.

When you mentioned that 2-3 GB cubes did not seem to have these issues, it got me thinking about the 2-3 GB memory limits of 32-bit platforms of days gone by, and I wondered whether any of the installed hardware / software might be of a different platform than expected or assumed.

I look forward to your feedback and a better understanding of your "puzzle". Thanks again for sharing your experiences. We all learn from these efforts.

Scott R.
Scott R. - 2014-04-01 16:36
Richard,

Sorry for being late to the game. I just read your Connect item, which linked to this post.

Your issue sounds most frustrating - I feel your pain.

I am no SSAS expert, but your issue was intriguing, so I have a few questions for you (mainly to clarify the problem statement in your original post):

- Can you please share the exact numerical values you use for your SSAS LowMemoryLimit, TotalMemoryLimit, and HardMemoryLimit configuration values (if you have specified any explicit values to override the defaults), and which ones are not explicitly set and rely on the default values? SSAS Books Online says the defaults are 65 (%), 80 (%), and 0 (zero - use the mid-point between low and total, or 72-73%), respectively.

I understand that your post states:
- 36 GB dedicated server
- 11 GB cube
- 65% LowMemory(Limit?) is mentioned - not sure if that is explicitly set or a default value. If in effect, it should result in an actual low memory limit of 23.4 GB (65% of 36 GB).
- No mention of Total or Hard memory limit settings (explicit or default) - possibly what was meant by "default configuration setting" at the start of one paragraph in your post.

After reading an interesting two-post series on SSAS memory configuration and a specific performance problem (see http://greg.blogs.sqlsentry.net/2009/06/analysis-services-memory-limits.html and http://greg.blogs.sqlsentry.net/2009/06/analysis-services-memory-limits-part-ii.html), I learned that these SSAS memory configuration values represent a percentage of server memory if they are <= 100, or bytes of memory if they are > 100. Yes - bytes, not KB, MB, GB, or any other storage unit! It surprised me too!

The root cause in that post series was the use of explicit SSAS memory configuration values that assumed KB instead of bytes, resulting in inadequate memory for SSAS and the performance problems they saw.

Is it possible that you explicitly set one or more of these SSAS memory configuration values thinking that the unit was KB, MB, or GB, when SSAS expects bytes, so the setting is far lower than you thought?

As an example: say you had manually calculated 65% of 36 GB as 23.4 GB, truncated it to 23, and set LowMemoryLimit to 23 (thinking GB was the unit). SSAS would have interpreted the 23 as a percentage of 36 GB (not as 23 GB), resulting in a configured low memory limit of 8.3 GB (23% of 36 GB) - less than the 11 GB needed by your cube.

Another alternative to try: set each of the three memory limit values to their appropriate byte values, based on manual calculations of the 65%, 80%, and 73% default percentages. The resulting limits in bytes should be:

 * 25,125,558,681 bytes: 23.4 GB - 65% of 36 GB
 * 30,923,764,531 bytes: 28.8 GB - 80% of 36 GB
 * 28,239,409,971 bytes: 26.3 GB - 73% of 36 GB

I'm guessing that the configuration values in bytes should not include comma thousands separators; I only include them here for readability.

The premise of explicitly setting the byte values is that the percentage values might result in miscalculated memory limits (arithmetic overflow, etc.). Just a theory.

I hit the reply size limit for your blog, so I will break this into a separate reply.

Scott R.
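[Editor's note] Scott's percentage-versus-bytes rule can be made concrete with a small sketch. The helper name is mine, and it assumes 1 GB = 1024^3 bytes with truncation to whole bytes:

```python
GB = 1024 ** 3

def effective_limit_bytes(setting, total_ram_bytes):
    """Interpret an SSAS memory limit setting: values <= 100 are a
    percentage of server memory; values > 100 are an absolute byte count."""
    if setting <= 100:
        return total_ram_bytes * setting // 100
    return int(setting)

ram = 36 * GB  # the server in the post

print(effective_limit_bytes(65, ram))             # default LowMemoryLimit: 25,125,558,681 bytes (23.4 GB)
print(effective_limit_bytes(23, ram))             # "23 GB" mistaken for a percentage: ~8.3 GB
print(effective_limit_bytes(25_125_558_681, ram)) # explicit byte value passes through unchanged
```

The second line reproduces the pitfall Scott describes: a value of 23 intended as gigabytes is read as 23 percent, yielding roughly 8.3 GB rather than 23 GB.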
Ihor Bobak - 2013-05-26 08:04
Richard,

Thank you very much for your post. I faced exactly the same problem as you. Moreover, I have an MDX calculation which runs intensively through a measure group; part of it looks like this:

FILTER
(
    NonEmpty(
        IIF(
            [ТоргТочки].[ТоргТочки_Id].CurrentMember.Level is [ТоргТочки].[ТоргТочки_Id].[(All)],
            EXISTS([ТоргТочки].[ТоргТочки_Id].[ТоргТочки_Id].Members, <>),
            existing [ТоргТочки].[ТоргТочки_Id].[ТоргТочки_Id].Members
        ),
        [Measures].[РезультатN_Count]
    ),
    [Measures].[РезультатN_Sum] > 0 and [Measures].[Оборот] > 0
)

This measure group has 500,000 rows and is 50 MB on disk (looking at the files in the DATA folder).

During execution of this request I see I/O load from msmdsrv.

Question: is it so difficult for Analysis Services to load the 50 MB entirely into memory BEFORE executing the request? Instead, it makes a lot of reads from disk...

I tried five different RAM-drive products; none of them helped. They all share the same problem: sequential copying of big files from RAM disk to RAM disk is fine (500 MB/sec), but copying a set of small files (e.g. an AS database's set of files) from RAM disk to RAM disk runs at about 50 MB/sec.
At least, that is the result I got.

So, if anyone knows RAM-disk software which works well with small files, we would all be grateful if you named it here.

Ihor Bobak - 2013-05-26 08:02
This comment has been removed by the author.

Anonymous - 2012-11-13 08:16
Have you looked at amd.com/ramdrive? I'm on an SSD i7 with 16 GB RAM, and I'm just DYING for faster performance on these SSRS / SSAS subreports.

I've been using this RAMDRIVE product for about a month with ABSOLUTELY ZERO PROBLEMS. I'd love it if someone could help me benchmark it :)

PS - I know this sounds ridiculous.. but I honestly still use SQL 2000 (on Win7) for development, and then I upsize to 2008 R2 :)
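[Editor's note] Ihor's observation (sequential big-file copies fast, many small files slow) and the benchmarking request above can both be checked with a throwaway harness like the one below. It is a sketch of my own, not tied to any particular RAM-disk product; point `base` at a RAM-disk path to reproduce his test. The directory names and sizes are arbitrary.

```python
import os
import shutil
import tempfile
import time

def copy_tree_mb_per_s(src, dst):
    """Copy a directory tree and return throughput in MB/s."""
    total = sum(os.path.getsize(os.path.join(root, name))
                for root, _, names in os.walk(src) for name in names)
    start = time.perf_counter()
    shutil.copytree(src, dst)
    elapsed = time.perf_counter() - start
    return (total / 1e6) / max(elapsed, 1e-9)

base = tempfile.mkdtemp()  # substitute a RAM-disk path here to test one

# Same total size either way: one 8 MB file vs. 2048 x 4 KB files.
big, small = os.path.join(base, "big"), os.path.join(base, "small")
os.makedirs(big)
os.makedirs(small)
with open(os.path.join(big, "one.bin"), "wb") as f:
    f.write(os.urandom(8 * 1024 * 1024))
for i in range(2048):
    with open(os.path.join(small, "f%04d.bin" % i), "wb") as f:
        f.write(os.urandom(4 * 1024))

print("one big file : %.0f MB/s" % copy_tree_mb_per_s(big, os.path.join(base, "big_copy")))
print("many small   : %.0f MB/s" % copy_tree_mb_per_s(small, os.path.join(base, "small_copy")))
```

Per-file open/close overhead dominates the small-file case, which is why the two rates differ even though the payload is identical.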
Richard Lees - 2012-08-29 18:11
If you want to see the Analysis Services cache manager improved to cache more of the cube in memory, vote at https://connect.microsoft.com/SQLServer/feedback/details/760107/multidimensional-cube-is-not-retained-in-memory-even-though-there-is-plenty-of-ram
Jonimatix - 2012-08-23 16:57
Thanks Richard,

Much appreciated, as this is a constant concern for us.

Cheers
Richard Lees - 2011-12-13 07:09
Hi Chris,
I think the shaped sets have something to do with it. But I would like an 11 GB cube to be cached on a 36 GB server.
Anonymous - 2011-12-12 20:14
Hi Richard,

I tried to leave a comment yesterday but it hasn't shown up...? Anyway, I was going to suggest that you're running into the problem I describe here: http://cwebbbi.wordpress.com/2009/02/04/arbitrary-shaped-sets-and-the-storage-engine-cache/

Chris
Richard Lees - 2011-12-12 10:29
Hi Unknown,
I was tracking every Perfmon counter. I only showed the salient counters in the picture above, or it would be quite unreadable. The system was not paging; it had free memory all day, and the paging file is on the C: drive.
Why don't you try to reproduce the same results? All you need is an 11 GB cube and a server with at least 2-3 times that much RAM.

Marco Russo - 2011-12-12 10:10
Richard, this is really strange to me. Are you sure that your server is not paging?
You should track many more performance counters, including private bytes for SSAS, the SSAS counters (especially memory) and the system memory counters (paging in particular).

Marco Russo - 2011-12-12 10:09
This comment has been removed by the author.