Thursday, September 19, 2013

Slow StatMan query issue with #SP2010 farm

Two days ago I sent the email below; unfortunately I haven't heard anything back and I haven't managed to resolve this issue yet. That's why I decided to blog about it, so I can also keep you posted on the progress!

Hi Help team,

I'm Jeroen (Dutch) all the way from South Africa, running into a Microsoft SQL Server 2008 R2 (SP2) - 10.50.4279.0 (X64)   Mar 26 2013 17:33:13   Copyright (c) Microsoft Corporation  Enterprise Edition (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1) (Hypervisor) challenge.

This server is also running SharePoint 2010 14.0.6117.5002 (Feb 2012 CU). The problem is that 'suddenly' (it used to work without this problem), when the first person uploads a new document to a SharePoint document library, the upload takes forever.
Thanks to your free 'Triage' tool, I managed to pinpoint the query that is taking forever to finish:

EXEC master.dbo.sp_whoisactive
@get_plans=1,@get_transaction_info =1

results in:
SELECT StatMan([SC0], [LC0]) FROM (SELECT TOP 100 PERCENT CONVERT([varbinary](200), SUBSTRING ([Content], 1, 100)++substring([Content], case when LEN([Content])<=200 then 101 else LEN([Content])-99 end, 100)) AS [SC0], datalength([Content]) AS [LC0] FROM [dbo].[AllDocStreams] WITH (READUNCOMMITTED)  ORDER BY [SC0] ) AS _MS_UPDSTATS_TBL

This seems to be related to statistics. Probably the insert query for the document triggers the query optimizer to create statistics, which causes a lot of disk I/O and takes more than 20 minutes to complete.
The auto_create_statistics and auto_update_statistics settings are FALSE (off) on the content database.
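A quick way to double-check these database-level settings is via sys.databases (a minimal sketch; the content database name is the one from this farm, substitute your own):

```sql
-- Check the auto-statistics settings on the content database; 0 means OFF
SELECT name,
       is_auto_create_stats_on,
       is_auto_update_stats_on
FROM sys.databases
WHERE name = N'WSS_Content_DiscoverPortal';
```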

I have manually created the statistics (not trusting the SharePoint timer job that is supposed to do that but often fails, as explained here:

I used this query to create the statistics:
EXECUTE sp_msforeachdb
'USE [?];
IF DB_NAME() NOT IN (''master'',''msdb'',''tempdb'',''model'')
BEGIN
    PRINT ''updating statistics in database ==> '' + DB_NAME();
    IF EXISTS (SELECT 1 FROM sys.objects WHERE name = ''proc_updatestatistics'')
    BEGIN
        PRINT ''updating statistics via proc_updatestatistics'';
        EXEC proc_updatestatistics;
    END
    ELSE
    BEGIN
        PRINT ''updating statistics via sp_updatestats'';
        EXEC sp_updatestats;
    END
END'

But unfortunately, that doesn't help. It still hangs on the same "SELECT StatMan…" query.

I have also looked at the indexes in the database / AllDocStreams table (by the way, the AllDocStreams table seems to hold all BLOB data for the documents in a SharePoint 2010 document library).
SELECT a.index_id, b.name, a.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats (DB_ID(), OBJECT_ID(N'dbo.AllDocStreams'), NULL, NULL, NULL) AS a
    JOIN sys.indexes AS b ON a.object_id = b.object_id AND a.index_id = b.index_id;

ALTER INDEX AllDocStreams_CI ON dbo.AllDocStreams REBUILD;

ALTER INDEX AllDocStreams_RbsId ON dbo.AllDocStreams REBUILD;

I used the last two statements to rebuild the indexes, resulting in 0% fragmentation.

I found this blog post that seems to describe the same type of issue, with no solution described:

And also these posts..
SELECT query is timing out due to SELECT STATMAN process
-->this is exactly it, but no answer..

--> no answer..

Does the BrentO help team have any other pointers for me to look at?
It's massively appreciated, and I think it may even be challenging enough to take a look at?

Thanks very much in advance,


Update: 14 October 2013
After posting this issue here:

I got some nice feedback and started re-checking the settings on the SharePoint content database. Somehow (probably during early testing while trying to resolve this issue) the database setting "Auto Create Statistics" was turned on. This is not best practice for a SharePoint content database (SharePoint's timer job takes care of creating statistics). After turning off this setting (resetting it to the default), the issue seems to be resolved. I will keep a close eye on this for the rest of the week and let you know next week whether this resolved this painful issue!
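For reference, resetting the setting back to the SharePoint default can be done with a simple ALTER DATABASE (a sketch, using the content database name from earlier in this post):

```sql
-- Reset both auto-statistics settings to OFF, the SharePoint default
ALTER DATABASE [WSS_Content_DiscoverPortal] SET AUTO_CREATE_STATISTICS OFF;
ALTER DATABASE [WSS_Content_DiscoverPortal] SET AUTO_UPDATE_STATISTICS OFF;
```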

Brent Ozar - How To Prove Your SQL Server Needs More Memory

I really enjoyed watching this short 30-minute webcast from Brent Ozar that shows you how to check whether your SQL Server needs more memory. Of course the relationship between SQL and SharePoint is clear, and SQL performance will directly impact the performance of any SharePoint implementation!

Thanks so much for sharing this info Brent !

The link to the webcast:

My personal notes from this webcast:
(I've put in the time frames of the video when Brent talks about that specific performance counter).
64 GB of memory costs less than your laptop.
  • Create a business case to justify the purchase of more memory for your SQL Server
  • Less than one weekday of your salary.

Cost of your time tuning + changing vs cost to upgrade the RAM to the next level

Memory - Available Mbytes

SQLServer: Buffer Manager - Page life expectancy
SQL keeps the grapes, but has to start making wine from scratch every time.
How long can SQL keep the page data in memory (cache) before turning to disk to get the data? (Measured in seconds.) The higher this value, the better (how long can the data be kept in-memory). Some say 300 seconds is the absolute lowest.
Even if you are above 300, if the cache gets cleared very often you still have a problem! (And putting more memory in won't solve that problem.)
Also see "Buffer pool questions" for details.
SQLServer: Memory Manager - Memory grants pending
How many queries are waiting for a memory grant before they can start. Should never be above 0. If it is above 0, you should just add memory!
SQLServer: Memory Manager - Target Server Memory
Target = size of the barrel
SQLServer: Memory Manager  Total Server Memory
Total = how full is it right now
SQLServer: SQL Statistics - Batch Requests / sec
How busy is my sql server due to incoming queries.
SQLServer: SQL Statistics - Compilations / sec
(Related to the above.) How many execution plans are being built per second. Ideally an execution plan is already cached, which lowers CPU usage.
If SQL Server is forced to compile more than 10% of queries (because it can't cache the plans), then we need more memory.
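All of the counters above can also be read from T-SQL instead of Perfmon; a minimal sketch via sys.dm_os_performance_counters:

```sql
-- Snapshot of the memory-related counters from the webcast.
-- Note: Batch Requests/sec and SQL Compilations/sec are cumulative here;
-- sample twice and take the delta to get a per-second rate.
SELECT RTRIM(counter_name) AS counter_name,
       cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN (N'Page life expectancy',
                       N'Memory Grants Pending',
                       N'Target Server Memory (KB)',
                       N'Total Server Memory (KB)',
                       N'Batch Requests/sec',
                       N'SQL Compilations/sec');
```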

Avoiding buying RAM
  • Parameterize queries
  • Keep your ORM up to date (NHibernate, LINQ, EF, etc.)
    • Keep these tools up to date! (developers)
  • Transition to stored procedures
    • Their plans are kept in the plan cache!
    • Only change this if this is causing the problem!

Buffer pool questions
  • What part of our database is "active"?
    • Percentage?
    • Cache the active part; you have to figure this out
      • Look at which data pages are cached in memory
      • Which databases are cached in memory?
        • Which tables?
  • What data pages are being cached in memory?
  • What queries are constantly running, thereby forcing those pages to be cached?
  • Analyse using: sys.dm_exec_query_stats
    • Pulls out the top 10 / 20 queries and how many resources they use
    • Make sure to order by total reads
  • Analyse which pages are in memory
    • Which databases and tables
    • This is a one-time snapshot!
  • It may take a long time to run these queries. Run them over a course of time.
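The one-time snapshot of which databases are occupying the buffer pool can be sketched with sys.dm_os_buffer_descriptors (each buffer page is 8 KB):

```sql
-- How much of the buffer pool each database is using right now
SELECT CASE database_id
           WHEN 32767 THEN 'ResourceDb'
           ELSE DB_NAME(database_id)
       END AS database_name,
       COUNT(*) * 8 / 1024 AS cached_mb
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY cached_mb DESC;
```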

From easy to hard (aka, cheap to expensive)
  • Slow, and less than 64 GB? Learn how to explain the business costs: your time and risks vs $500 of RAM.
  • Memory Grants Pending > 0? Queries can't start; buy more memory.
  • Compilations / sec over 10% of Batch Requests / sec?
    • SQL Server may not be able to cache plans.

Thursday, September 12, 2013

Script to download documents from SharePoint that match a certain view

Today I got a request to put some files from SharePoint on a USB stick.
My boss is currently in some jungle, and although they have a (satellite) SharePoint connection that is fine for conducting searches and "defining which files they need", it's almost impossible to download 11 GB over it.

That's why he told me which document sets he wanted on a USB stick, for HIS boss to take with him.

Our situation is that we have a few document libraries with about 20.000 files in them, using views and document sets to make sense of it all. Who needs folders (which are just a piece of metadata to group some files together… right)?!
That's all cool… until you want to export a few document sets plus a few hundred other files grouped together via a value in a column, a grouping which obviously doesn't exist when you click the "Open in Explorer" button.

After some thinking I decided to see if I could export the files to the file system via PowerShell.
Pretty soon I found this article:

This article from Jack shows the basics of downloading items from a SharePoint list via PowerShell. My only problem was that I don't need ALL files, just certain files.
So I created a view in SharePoint that matches my criteria. Basically a view per destination folder, so that it makes a bit of sense after exporting the files. (Folders… you know… the old-school stuff from before SharePoint libraries!)

Quite soon I found out about the SPList class and its .GetItems method (the sample script I had uses SPList.Items to iterate through each item in the list).
The Items property doesn't support filtering on a view or query, so I changed it to the SPList.GetItems method.
You can provide a query or … a VIEW to this method! YAY…
Unfortunately I hit a snag in this part:
$filelocation = $item["ows_ServerUrl"]
The property "ows_ServerUrl" is not in the set of properties returned by the .GetItems method, so you can't use it to provide the file location to the download component.

After some research I came across this post:

And there I saw they provided the file (item) location using the following command:
 SPFile file = item.File;

So, I tried the same method in my script and … voilà! It was working!!
Downloading the items that match the view !

Of course you can just change the filtering in the view, matching whatever files you wish to export from SharePoint and run the PowerShell… put them in a folder that matches the view and you're done !

I didn't come across any other example on how to do this, so I thought let's contribute and make a blog about it.

The final script:

#Script to download documents from SharePoint that match a certain view
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

#site name (placeholder value, use your own site URL)
$sourceurl = "http://sharepoint/sites/yoursite"

#view name (the view that filters the export; placeholder value)
$viewname = "YourViewName"

#destination location (placeholder value)
$destinationfolder = "C:\Export"

$sourceweb = Get-SPWeb $sourceurl
$sourcelist = "BRGM"
$list = $sourceweb.Lists[$sourcelist]
$spview = $list.Views[$viewname]
$count = $list.Items.Count

Write-Host "iterating $count items in list"
$listItems = $list.GetItems($spview)

Write-Host $listItems.Count
Write-Host "Items in view"

foreach ($item in $listItems)
{
    Write-Host "Processing:"
    Write-Host $item.File.Name

    if ($item.FileSystemObjectType -eq "File")
    {
        $filename = $item.File.Name
        $spFile = $item.File
        [string]$targetname = $filename
        [string]$target = $destinationfolder + "\" + $targetname

        #read the file from SharePoint and write it to the destination folder
        $binary = $spFile.OpenBinary()
        [System.IO.File]::WriteAllBytes($target, $binary)
    }
}

$sourceweb.Dispose()


Thursday, September 05, 2013

Latest .NET security update breaks SharePoint 2010 datasheet view

The latest Microsoft update patches break the Datasheet view on #SP2010. Confirmed on 2 clients now (so it was working, but after running Windows Update it was not working anymore). It now returns "The selected cells are read-only" when trying to edit a field in the Datasheet view (any field; all were previously editable).

The Datasheet view gives "column is read-only" after one of these updates is installed on a Win7 + Office 2010 client:
KB2840628v2 (Security Update for Microsoft .NET framework 4 Client Profile)
KB2835393 (Security Update for Microsoft .NET framework 4 Client Profile)
KB2804576 (Security Update for Microsoft .NET framework 4 Client Profile)
KB2789642 (Security Update for Microsoft .NET framework 4 Client Profile)
KB2742595 (Security Update for Microsoft .NET framework 4 Client Profile)
KB2737019 (Security Update for Microsoft .NET framework 4 Client Profile)
KB2736428 (Security Update for Microsoft .NET framework 4 Client Profile)
KB2729449 (Security Update for Microsoft .NET framework 4 Client Profile)
KB2656351 (Security Update for Microsoft .NET framework 4 Client Profile)
KB2604121 (Security Update for Microsoft .NET framework 4 Client Profile)
KB2600217 (Update for Microsoft .NET Framework 4 client Profile)
KB2533523 (Update for Microsoft .NET Framework 4 client Profile)
KB2468871 (Update for Microsoft .NET Framework 4 client Profile)
KB2742595 (Security Update for Microsoft .NET Framework 4 Extended)

Please let me know if you run into similar problems, or how to notify Microsoft about this. I did have to install the 2007 Office System Driver: Data Connectivity Components to regain the Datasheet functionality in the first place:

(re-installing this tool does not restore the functionality)