http://www.linkedin.com/pulse/some-common-issues-errorsand-resolutions-while-ssas-cube-kp-maddineni
Problem
What are the different methods of dealing with errors in SQL Server Analysis Services (SSAS) processing, and should you change the default options? What are the SSAS error processing options for a cube, a partition, or a dimension? How can you change the SSAS error processing configuration?
Solution
SQL Server Analysis Services (SSAS) offers an array of error handling techniques for common issues that surface when processing a cube, a partition, or a dimension. These properties allow you to set error count thresholds for stopping processing while at the same time telling SSAS how to handle specific errors that occur. As with many features, you need to use these options appropriately and with full knowledge of their impact on your data: many of them silently tolerate data that would violate the foreign key constraints of a normalized OLTP database. On the other hand, the error processing options provide a great way for cube processing to continue when only minor or immaterial errors occur, which in turn provides excellent "up and running" time for the cubes themselves.
Setting the SSAS ErrorConfiguration Properties
As shown in the next illustration, the default error configuration for an SSAS cube is, not surprisingly, "(default)". In actuality, 9 properties make up the error configuration setting, and each property has its own default value as shown in the second illustration below.
Each of the 9 properties controls a slightly different processing behavior, and several of the settings share the same set of potential values. The properties include:
- CalculationError - Determines what occurs when an error is found in a calculation created on the Calculation tab.
- KeyDuplicate - Determines what occurs when duplicate keys are found in a dimension.
- KeyErrorAction - Determines what action is implemented when a KeyNotFound error occurs. The two options are ConvertToUnknown and DiscardRecord.
- KeyErrorLimit - Determines how many errors can be recorded before processing stops; -1 allows for unlimited errors.
- KeyErrorLimitAction - Determines what happens when the KeyErrorLimit is reached: StopProcessing stops the processing run, while StopLogging continues processing but stops recording errors.
- KeyErrorLogFile - Notes the location of the SSAS processing error log file, the SSAS service account must have access to the directory.
- KeyNotFound - Determines what occurs when a foreign key in the fact table does not have a matching primary key value in the related dimension table.
- NullKeyConvertedToUnknown - Determines what occurs when null values are converted to unknown members.
- NullKeyNotAllowed - Determines what occurs when null values are not allowed.
The option grid below shows the actions available for CalculationError, KeyNotFound, KeyDuplicate, NullKeyNotAllowed, and NullKeyConvertedToUnknown; each of these five properties can be set to IgnoreError, ReportAndContinue, or ReportAndStop:
The difficult thing with the error configurations is that they can be set in any or all of the following places:
- Cube properties
- Dimension properties
- Partition properties
- During processing of any of the above (a minimal XMLA sketch of this last option follows)
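The sketch below shows roughly what SSMS generates when you script a processing command with custom error settings; the database and cube IDs and the log path are placeholders for your own objects:

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Parallel>
    <Process>
      <Object>
        <DatabaseID>Adventure Works DW</DatabaseID>
        <CubeID>Adventure Works</CubeID>
      </Object>
      <Type>ProcessFull</Type>
    </Process>
  </Parallel>
  <!-- overrides the stored ErrorConfiguration for this run only -->
  <ErrorConfiguration>
    <KeyErrorLimit>-1</KeyErrorLimit>
    <KeyErrorLogFile>C:\Temp\KeyErrors.log</KeyErrorLogFile>
    <KeyNotFound>IgnoreError</KeyNotFound>
  </ErrorConfiguration>
</Batch>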
Furthermore, NULL processing always occurs before the error configurations are applied, and in some cases the UnknownMember settings of a dimension must be set in a specific way to interact with the error configuration properties.
SQL Server Analysis Services Error Processing Example
So what does all this mean? It means that you need to make sure you understand and use the proper settings when you process your cubes, partitions, and dimensions.
Let us look at a few examples. I am going to make changes to the PromotionKey field in the FactInternetSales fact table and to the related primary key, PromotionKey, in the DimPromotion table. To be able to show the various examples, I needed to remove the foreign key constraint on the PromotionKey field in the fact table. I then updated the PromotionKey in the fact table to a value that I knew was not a valid primary key value in the Promotion dimension source table. As illustrated below, I updated the value to 88888.
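The relational side of this setup looks roughly like the following T-SQL sketch; the constraint name is the AdventureWorksDW default and may differ in your copy, and which row you update is up to you:

-- drop the foreign key so an orphaned key value can be introduced
-- (constraint name may differ in your copy of AdventureWorksDW)
ALTER TABLE dbo.FactInternetSales
  DROP CONSTRAINT FK_FactInternetSales_DimPromotion;

-- point one fact row at a PromotionKey with no match in DimPromotion
UPDATE TOP (1) dbo.FactInternetSales
SET PromotionKey = 88888;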
Now when the cube is processed, the attribute key not found error shown below results.
One option, shown below, is to change KeyNotFound to IgnoreError. This change means that any key not found errors will not be counted in the error count, nor will they be logged. Also, because the KeyErrorAction property is set to ConvertToUnknown, the invalid PromotionKey value will be converted to Unknown.
The result of this setting is that the cube processes without reporting any errors, as shown below. Of course, this result is a false positive: the cube did process fine, but the offending data row was "quarantined," so to speak, and its data is not included in the fact table measure values reported to the client application and report. The row is absent from the data, and we have no knowledge that a problem even existed.
We could create a similar processing outcome while at least getting a report of the errors by changing two fields: 1) changing the KeyNotFound property to ReportAndContinue and 2) adjusting the KeyErrorLimit to the number of errors we are willing to accept. In this scenario, we are requesting that each error be reported (both via the processing window and the log, which we will discuss later) and that processing stop if we see more than 10 errors.
Now when the cube is processed, the processing still completes and the row is still removed from the measure data. However, we receive a notification that a key error exists, as displayed next. An error message is displayed in the processing window that tells us specifically what error occurred, i.e. the missing promotion key value.
Next, by adding a path and file name to the KeyErrorLogFile property, the errors will get logged to a log file.
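In the cube's ASSL, the combination just described (report and continue, a 10-error limit, and a log file) corresponds roughly to this ErrorConfiguration fragment; the path is an example:

<ErrorConfiguration>
  <KeyNotFound>ReportAndContinue</KeyNotFound>
  <KeyErrorLimit>10</KeyErrorLimit>
  <KeyErrorLogFile>C:\Temp\KeyErrors.log</KeyErrorLogFile>
</ErrorConfiguration>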
The error log below clearly shows the PromotionKey error even though the cube still processed to completion. Even so, the invalid row is removed from the cube results; the data results displayed in the cube browser in the second screen print below do not include the invalid key data row.
Up to this point, the offending row has been removed from the dataset returned to the client application. We could allow the row to show as an "Unknown" value by making two adjustments to the dimension that is causing the error, Promotion for our example. First, within the dimension properties, the UnknownMember property needs to be set to Visible. Second, the UnknownMemberName should be updated to a relevant value for your situation as this name will be what is displayed to the end user.
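In the dimension's ASSL, these two adjustments correspond roughly to the fragment below; the member name is simply the example used in the next screen print:

<Dimension>
  <!-- other dimension properties -->
  <UnknownMember>Visible</UnknownMember>
  <UnknownMemberName>NEED TO BE UPDATED IN NEXT RUN</UnknownMemberName>
</Dimension>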
Now when the cube data is reviewed, the offending row is displayed, in the below example, with the specific name "NEED TO BE UPDATED IN NEXT RUN".
Although we concentrated on the KeyNotFound property, similar error configuration methods can be followed with the KeyDuplicate, CalculationError, NullKeyConvertedToUnknown, and NullKeyNotAllowed properties, so we will not go over those in detail.
In addition to the cube error configuration property settings, these same settings can be adjusted at both the partition and dimension level, as shown in the next two figures, and can override the cube settings.
Furthermore, during the actual cube processing, we can adjust / override how errors are handled. First, the cube processing must be initiated and then the Change Settings button clicked as illustrated below.
Next, clicking on the Dimension key errors tab and then the Use custom error configuration radio button allows us to adjust the error settings at processing "run time."
In the above screen print example, I left the Number of errors at 0; thus a single error will now stop processing of the cube. As such, the cube quickly errors out with these "run time" settings, which allow no errors for this example.
Conclusion
Out of the box, SSAS has default settings for handling errors; these settings, which exist under the ErrorConfiguration property, allow for various outcomes to occur while processing a cube, partition, or dimension. Five of the properties control whether an error is reported and processing continues, an error is reported and processing stops, or the error is ignored and not reported.
Furthermore, NullProcessing of values is handled first during processing, and it determines how the NullKeyConvertedToUnknown and NullKeyNotAllowed errors are handled. When the number of allowable errors is increased to a number greater than zero, the cube will continue processing until the number of errors reported meets the new threshold level.
However, when the error method is set to report and continue or ignore, the offending rows will not be included in the measure values unless the UnknownMember properties are set for the related dimension. To further complicate the situation, the ErrorConfiguration settings can be set at the cube level, the partition level, the dimension level, or during the processing itself.
With all these scenarios, much care must be taken when using non-default ErrorConfiguration settings.
Solution for the SSAS FileStore Error While Processing Dimensions
The dreaded file system error: “A FileStore error from WriteFile occurred.”
File system error: A FileStore error from WriteFile occurred. Physical file: \\?\C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\Temp\MSMDCacheRowset_1136_e48_dnogks.tmp. Logical file: . : The parameter is incorrect. .
(Sidenote: it was not the string limitation issue which, incidentally, is fixed in SQL Server 2012.)
What’s The Fix?
The SQL Server versions were different between Dev and Test. As it turns out, the Development SQL Server box had Service Pack 1 for SQL Server 2008 R2 installed whereas the Test box did not, so SP1 was installed there. The next time the cube was processed, the issue was gone.
To download SP1: Download Center
Data collection for troubleshooting Analysis Services issues
These are the types of logs/data that we typically request when investigating Analysis Services issues.
Most of these logs (excluding dumps) are easily readable and you can use them for your own investigation.
MS support uses a number of tools (mps reports, Support Diagnostic Platform, pssdiag, sqldiag) for automated data collection.
Unfortunately these tools are not suitable for SSAS data collection yet. This will hopefully change in the near future.
Until then we are stuck with manual or semi-automated data collection.
The kind of data we need to analyze will obviously depend on the nature of the problem under investigation.
General data:
1. msmdsrv.ini
The configuration file for the SSAS instance “msmdsrv.ini” can be found in the “\config” folder of the SSAS instance.
Typically something like “C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\Config”.
If you are unsure about the location, then have a look at the properties of the SSAS service under “services.msc”.
The “path to executable” field will have the config file folder as “-s” startup parameter:
f.i. “C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\bin\msmdsrv.exe” -s “C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\Config”
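If you prefer a command prompt, something like the following prints the same information; the service name shown is the typical one for a default instance (named instances usually appear as MSOLAP$<instance>):

rem print the service configuration, including the "path to executable" with its -s parameter
sc qc MSSQLServerOLAPService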
“msmdsrv.ini” contains the server properties in XML format. You should not edit it unless MS support asks you to do so. And even then do it very carefully.
The supported interface for changing instance properties is SQL Server Management Studio.
2. windows event logs
Windows event logs have been around forever; you can access them via “Administrative Tools” –> “Event Viewer”.
SSAS error messages will appear in the application log. System problems (memory, disk space) will appear in the system log.
For problem analysis we are interested in both, system + application event log.
Please save them in .txt or .csv format in order to ensure that event resolution happens on the source server.
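One way to script this, as a sketch, is wevtutil, which resolves the event text on the source server itself; adjust the event count as needed:

rem export the most recent 1000 events from each log as resolved text
wevtutil qe Application /f:text /c:1000 /rd:true > application.txt
wevtutil qe System /f:text /c:1000 /rd:true > system.txt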
3. system information / msinfo32
This log gives us valuable information about your host machine setup.
The number + type of CPUs, RAM size, page file size and lots of other hardware related information.
Please save as system information file (.nfo) or export to text file.
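Both formats can also be produced unattended; the paths below are examples:

rem save a .nfo system information file and a plain-text report
msinfo32 /nfo C:\temp\system.nfo
msinfo32 /report C:\temp\system.txt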
4. msmdsrv.log
Optional. The log file for the SSAS instance can be found in the \log folder of the instance (f.i. C:\Program Files\Microsoft SQL Server\MSAS10.MSSQLSERVER\OLAP\Log).
For exact location check instance properties in Management Studio or “msmdsrv.ini”.
Typically all the information in here should be available from application log as well.
But it does not hurt to double-check its content.
5. content of SSAS data folder
Optional. Often it is also quite useful to have a look at the sizes of the SSAS data files.
An overview of the SSAS data files in use can be created by executing the following commands from a command prompt:
cd <SSAS data folder>
dir /s > datafiles.txt
Data collection for special scenarios:
Depending on the problem area we will need to look at additional data.
Here’s a short list of the problem areas we typically observe.
A. setup issues
All setup related log files can be found in the “setup bootstrap” folder.
Usually : “C:\Program Files\Microsoft SQL Server\100\Setup Bootstrap\Log”
The summary log will show you the general outcome of the setup operation and point you to log files that give more detail about the setup issue (if there was any).
When contacting Microsoft support please zip up the log folder + provide it to us.
B. cube design issues
For cube design issues we will need to have a look at the metadata of your SSAS database.
The XMLA script can be generated in “Management Studio” by right-clicking the database icon -> script database as -> Create to -> File …
C. internal errors /crashes /exceptions /hangs
SSAS automatically creates mini dumps (.mdmp) when it runs into exceptions or certain (configurable) errors.
For a detailed description of settings see:
919711 How to configure SQL Server Analysis Services to generate memory dump files
http://support.microsoft.com/default.aspx?scid=kb;EN-US;919711
This diagnostic information is very valuable to us and is created in the \log folder of the SSAS instance (f.i. C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\Log).
Please collect the mini dumps (SQLDmprxxxx.mdmp) and the log files (SQLDmprxxxx.log) that are associated with your issue (check the time stamps of the files!).
The SQLDUMPER_ERRORLOG.log records all mini dumps that have been created and can be useful as well.
For hang issues we may ask you to create “hang dumps” manually.
You can do this by using the “sqldumper.exe” utility as described in “How to generate a full dump file that includes handle information manually” section of KB 919711.
Full dumps (as big as SSAS memory foot print) can be created via “sqldumper.exe PID 0 0x34:0x4 0 PathToDumpFile” command.
Mini dumps (typically 100 – 300 MB) can be collected via “sqldumper.exe PID 0 0x120 0 PathToDumpFile” command.
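For illustration, assuming the msmdsrv.exe process ID is 1234 (taken from Task Manager) and C:\temp\dumps as the output location, the two commands above become:

rem full dump including handle information
sqldumper.exe 1234 0 0x34:0x4 0 C:\temp\dumps
rem mini dump
sqldumper.exe 1234 0 0x120 0 C:\temp\dumps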
D. performance issues / hangs
Performance issues are best analyzed with a combination of SQL profiler traces and performance counter logs.
Sometimes we may also need to create mini dumps manually (hang dumps).
SQL Server profiler:
Under the “sql server” program group and “performance tools” subheading you will find the “sql server profiler” tool.
Start it, select File -> new trace and connect to “server type” = “Analysis Services” + your SSAS instance name.
On the trace properties page make sure the Standard trace template is selected. You should also enable file rollover with a maximum file size of 200 MB.
Hitting the run button will start the trace.
Hitting the red square button will stop it again.
Since SQL Server 2008 R2 Service Pack 1 we also have new “resource usage”, “MDX script” and “Lock” events that may prove useful for troubleshooting performance issues:
2458438 FIX: SQL Server 2008 R2 Analysis Services introduces new trace events to track resource usage and locks by using SQL Server Profiler
http://support.microsoft.com/default.aspx?scid=kb;EN-US;2458438
If you are seeing a hang and suspect deadlock issues, then don’t forget to include the deadlock event as well.
performance counter log:
The basics for performance counter log handling are described here: http://technet.microsoft.com/en-us/library/cc766404.aspx
For SSAS performance troubleshooting we need the following OS counter groups:
Memory (all counters)
Processor (all instances, all counters)
Process (all instances, all counters); sometimes we limit collection to the msmdsrv and sqlservr process instances in order to reduce data volume
Logical Disk (all instances, all counters)
Paging File (% Usage)
In addition we need at least the following SSAS counter groups:
MSAS:Threads
MSAS:Memory
MSAS:Locks
In the collector set properties you should select a 200 MB file rollover as indicated below:
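As a sketch, a similar collector can also be created from the command line with logman; the SSAS counter object names vary by version and instance (f.i. “MSAS 2008:Memory” for a default SQL Server 2008 instance, “MSOLAP$<instance>:Memory” for a named one), so treat the names below as placeholders:

rem 15-second samples, binary format, 200 MB rollover to a new versioned file
logman create counter SSASPerf -c "\Memory\*" "\Processor(*)\*" "\Process(*)\*" "\LogicalDisk(*)\*" "\Paging File(*)\% Usage" "\MSAS 2008:Threads\*" "\MSAS 2008:Memory\*" "\MSAS 2008:Locks\*" -si 00:00:15 -f bin -max 200 -cnf -v mmddhhmm
logman start SSASPerf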
E. “Repro”
Typically this involves an SSAS database backup (.abf) and an MDX query or application sample code that triggers the problem.
If reprocessing of the data is required or ROLAP data are involved, then we will likely need a backup of the relational source data as well.
Additional Information:
1. Locking mechanisms are inherent in SQL Server (OLAPLockTypes); hence this feature is deprecated. By default, if a user is updating or processing an object in SSAS, other users will not be able to access that object until it becomes free. The explicit Rollback and Commit transactions provided by the server can help in granting both users, or neither of them, the ability to update or read the cube simultaneously. The query log table, available by default in the SQL Server Analysis Server repository, is helpful for determining OLAP usage statistics. SSAS exposes this option via its properties, where you must manually set the data source in which the query log table will be created. The structure of this table is fixed and you cannot alter it.
Binary values are stored in the Dataset column. You can also run background trace files, initiated by SQL Profiler, that store recent information about query runs; however, if you want to log every single query or activity, it is better to set up a trace. How do you leverage the query log table? The Usage Based Optimization wizard is meant for this; it prompts for the required details, such as begin and end dates, and a target set to a performance gain of 30%. Doing count estimates while partitioning the cube helps in tuning query performance as well as in determining whether the partition limit has been reached (not more than 20 million rows). The key to the choice of storage settings (MOLAP or ROLAP) follows the same idea: for example, set a 30% performance gain with MOLAP on partition 200707, partition 200708, and so on. For rarely queried partitions, use ROLAP with a slice value set. Do apply proactive caching for frequently accessed MOLAP partitions; it is an enhancement feature.
2. While processing dimensions, beware of those whose string stores exceed 4 GB, since an exception is thrown; there is no setting in these versions to increase the size of the string store (the limitation is lifted in SQL Server 2012, as noted above), and the dimension should be set manually to process by attributes rather than by tables. The common error that occurs here is "File system error: A FileStore error from WriteFile occurred. Physical file...". The solution is to change all the ProcessingGroup properties to ByAttribute, as in the fragment below.
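For illustration, in the dimension's ASSL this is a single property; the surrounding element is abbreviated:

<Dimension>
  <!-- other dimension properties -->
  <ProcessingGroup>ByAttribute</ProcessingGroup>
</Dimension>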
3. AMO can be used in ASP.NET by making sure that the account name you add to your application pool has OLAP admin privileges. Once done, you can even delegate the privileges to users using AMO. You can change the account on the Identity tab of your selected application pool's properties in inetmgr.
4. Clearing Analysis Services process logs (mdb files, or tempdb in case you have migrated the repository):
Right-click Analysis Services -> Properties -> Logging -> clear the log -> specify a drive having enough space. Then run DBCC SHRINKDATABASE (tempdb) or DBCC SHRINKFILE (file name, size in MB), or use an ALTER DATABASE tempdb ... command to specify the minimum size, and stop and restart the Analysis Services service. In fact, the second step considerably increased my free tempdb space. A minimal T-SQL sketch follows.
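The sketch assumes the default tempdb logical file name (check sys.database_files on your server):

-- shrink tempdb after clearing the process logs
USE tempdb;
GO
DBCC SHRINKDATABASE (tempdb);
GO
-- or shrink a single file to a target size in MB
DBCC SHRINKFILE (tempdev, 100);
GO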
5. Have you come across the very common error "Class Not Registered" when you click browse cube? The solution is to install OWC, the Office Web Components.
Validation of data by browsing dimensions and cubes consists of the following steps:
Check that all the levels show data after migration.
Browse the cube, check every dimension against the calculated members, and drill down to the lowest level, matching it against the legacy system.
Note: Pivoting should also show the similarity in data down to the lower levels.
6. The attribute hierarchy relationships will get grouped down at the lowest level attribute (the key attribute). In the above case, every attribute must be re-linked manually to its immediate parent attribute. Recreating the dimensions is required in the case of attributes having parent-child relationships.
7. Have you come across the error "OLE DB or ODBC error: Login failed for user 'NT AUTHORITY\NetworkService'.; 42000"? This occurs because that account must be added as a login on the SQL Server where your data source resides. Do apply the sysadmin privilege to it if you are using Analysis Services.
8. You might get the error "Server Option cancelled.. check the driver...". The solution is to check that IIS, Network DTC, and Network COM+ under Windows installation components -> Application Server have been selected, and that the respective services are running in the Services snap-in.
9. "Error in OLAP Storage Engine: Invalid Key Error": the solution is to check for data quality issues; if you need to ignore these errors while processing, change the settings under the Dimension key errors tab to ignore errors.
"Drill through failed. Error in the OLAP Engine ..": under Analysis Services -> Advanced Properties, set the value of OLAP\Process\ROLAPProcessingEffort = count of fact table rows + probable growth in rows, provided the fact dimensions for the drillthrough have been added with ROLAP storage set.
10. "Connection timed out ..": under Analysis Server -> Advanced Properties, set the value of ConnectionTimeout = 0 and ExceedConnectionTimeout = 0.