So,
I'm helping out a small business by creating an online presence for them via O365: just showing the products, with a product code in a category and an image.
There are about 3000 products, so showing them all at once sounds like a bad idea.. Let's just group them by category; users can open a category and see all products with a little product image (thumbnail), and in the future select them to get a quote.
Of course, all of this happens in the "public" = "anonymous" part of the O365 site.
I'm sorry .. but that's not going to work !
Microsoft support (SRX1219692554ID) suggests you'd better use an alternative view without the grouping functionality. (That's the final "work-around" they gave me after some extensive testing.)
As it turns out, anonymous users don't have sufficient permissions to access this information, resulting in a "Working on it..." status as soon as these anonymous users hit the + sign in front of a group. This status won't change.. no matter how long you wait for it; in the background, O365 is trying to redirect these users to the login page, which is not going to work for this type of user.
Authenticated users see the products in a category:
Anonymous users just see "Working on it..."
The "work around" is of course not helping at-all, there are no other views that have this functionality.. I'm not even daring to think to create a Access APP (view) that can host this functionality for anonymous users...
Big snag !! Maybe some JavaScript + XSLT to the rescue.. I'll post a real work-around as soon as I find one !
Saturday, October 19, 2013
Wednesday, October 02, 2013
3rd Party Apps that leave their Sh#!T behind...(or crash your farm in the removal process)
You know the situation, lots of time pressure on the project, the client needs a little [treeview] [search optimizer] [birthday calendar] [pdf / cad / xxx viewer] [etc] (select or choose your own) addition to the SharePoint implementation.
There is no budget, because .. why should we buy this ??! Doesn't it come with SharePoint ? We'll think about it when we see it working...
And you know you should use the dev / test environment to run these trials on .. but hey... sometimes there is only 1 environment...
STOP... Don't DO IT !!
Chances are that the 3rd party tool you are installing is NEVER going to leave the farm / web app / site collection, or you're going to spend A LOT of time cleaning up after you try to remove it.
Maybe you don't care what's left behind.. until you see the warnings appear in the Health Analyzer.. Maybe you don't look in the Health Analyzer ? You will... once you install a CU or want to upgrade to SharePoint 2013.. which won't work.. because there are some missing components ! ALERT ALERT ALERT !!! Of course you did a trial upgrade .. right !?!
The above scenarios I consider "lucky"; on one occasion I saw the whole farm die when I uninstalled a 3rd party product, giving a nice correlation ID message to all visitors until YOU, the admin who broke everything (euh.. I clicked uninstall... wtf ?!?!), FIX IT ! (For free of course.. )
Things that can get left behind (even after the neat uninstall says "finished"); see the PowerShell sketch after this list for a way to inventory them:
- Event receivers (in every single site (collection) !)
http://speventreceiverman.codeplex.com/
- Web Parts
(web part gallery)
http://etienne-sharepoint.blogspot.com/2011/10/solving-missingwebpart-errors-from.html
- Assemblies
http://etienne-sharepoint.blogspot.com/2011/10/solving-missingassembly-errors-from.html?showComment=1380725089569#c3924490290473490722
- Setup Files
http://etienne-sharepoint.blogspot.com/2011/10/solving-missingsetupfile-errors-from.html
- Features
http://featureadmin.codeplex.com/
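One way to get a quick inventory of this kind of debris is Test-SPContentDatabase, which reports missing features, assemblies, web parts and setup files per content database. A minimal sketch (run from the SharePoint 2010 Management Shell; the web application URL is a placeholder):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

#Check every content database of a web application for orphaned references
Get-SPContentDatabase -WebApplication "http://portal" |
    ForEach-Object { Test-SPContentDatabase -Name $_.Name -WebApplication "http://portal" }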
Add DB server IP to localhost when using SQL Alias
So, when you implement the "best practice" of using SQL Server alias names (to be flexible at all times about where your SharePoint databases are hosted), apparently it's also a best practice to add the IP & alias name to the hosts file on all your WFE & APP servers that are using this name.
This came to my attention when I started receiving ERROR entries in the Windows Application log of a WFE:
Log Name:      Application
Source:        Microsoft-SharePoint Products-SharePoint Foundation
Date:          2013/10/02 07:24:00 AM
Event ID:      6398
Task Category: Timer
Level:         Critical
User:          **
Computer:      **
Description:
The Execute method of job definition Microsoft.SharePoint.Diagnostics.SPDatabaseServerDiagnosticsPerformanceCounterProvider (ID 5bce4422-820a-465f-aee4-5ce0f31640ce) threw an exception. More information is included below.
SharePointDbServer: The network path was not found.
Please ensure that the database server is available.
Eh.. duh.. of course the database server is available.. I personally know the DBA ! :P :P
(Yeah, he's one of my personalities)...
Ok, so.. apparently when you use the SPDiag toolkit from MS, this toolkit starts monitoring stuff and will trigger the above ERROR in your event log, as described here:
Thank you so much Mohammed Al-Basri for pointing this out :)
I added the home IP (in this case it's a single-server install) 127.0.0.1 + DB alias name to the hosts file and voilà, another problemo solved !
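For reference, a minimal sketch of adding that hosts-file entry from PowerShell (the alias name "SQLALIAS" is just a placeholder for your own alias; run it elevated, because the hosts file is protected):

#Map the SQL alias to the loopback address on a single-server install
Add-Content -Path "$env:windir\System32\drivers\etc\hosts" -Value "127.0.0.1`tSQLALIAS"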
Thursday, September 19, 2013
Slow StatMan query issue with #SP2010 farm
Two days ago I emailed help@brentozar.com the mail below; unfortunately I haven't heard anything back and I haven't managed to resolve this issue yet. That's why I decided to blog about it, so I can also keep you posted on the progress !
Hi Help team,

I'm Jeroen (Dutch), all the way from South Africa, running into a challenge on Microsoft SQL Server 2008 R2 (SP2) - 10.50.4279.0 (X64) Mar 26 2013 17:33:13 Copyright (c) Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1) (Hypervisor).

This server is also running SharePoint 2010 14.0.6117.5002 (Feb 2012 CU). The problem is that 'suddenly' (it used to work without this problem), when the first person uploads a new document to a SharePoint document library, this upload takes forever.
Thanks to your free 'Triage' tool, I managed to pinpoint the query that is taking forever to finish:

EXEC master.dbo.sp_whoisactive @get_plans = 1, @get_transaction_info = 1

results in:

SELECT StatMan([SC0], [LC0])
FROM (SELECT TOP 100 PERCENT
          CONVERT([varbinary](200), SUBSTRING([Content], 1, 100)++substring([Content], case when LEN([Content])<=200 then 101 else LEN([Content])-99 end, 100)) AS [SC0],
          datalength([Content]) AS [LC0]
      FROM [dbo].[AllDocStreams] WITH (READUNCOMMITTED)
      ORDER BY [SC0]) AS _MS_UPDSTATS_TBL
This seems to be related to the statistics.. Probably the insert query for the document triggers the query optimizer to create statistics, which causes a lot of disk I/O and takes more than 20 minutes to complete...
The auto_create_statistics & auto_update_statistics settings are FALSE / off on the content database.
I have manually created the statistics (not trusting the SharePoint timer job that is supposed to do that but often fails, as explained here:
I used this query to create the statistics:
EXECUTE sp_msforeachdb
'USE [?];
IF DB_NAME() NOT IN (''master'',''msdb'',''tempdb'',''model'')
begin
    print ''updating statistics in database ==> '' + db_name()
    if exists (select 1 from sys.objects where name = ''proc_updatestatistics'')
    begin
        print ''updating statistics via proc_updatestatistics''
        exec proc_updatestatistics
    end
    else
    begin
        print ''updating statistics via sp_updatestats''
        exec sp_updatestats
    end
end'
But unfortunately, that doesn't help.. It still hangs on the same "SELECT StatMan…." query.
I have also looked at the indexes in the database / AllDocStreams table (by the way, the AllDocStreams table seems to hold all BLOB data for the documents in a SharePoint 2010 document library).
SELECT a.index_id, name, avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats (DB_ID(), OBJECT_ID(N'dbo.AllDocStreams'), NULL, NULL, NULL) AS a
JOIN sys.indexes AS b ON a.object_id = b.object_id AND a.index_id = b.index_id;
GO

ALTER INDEX AllDocStreams_CI ON dbo.AllDocStreams REBUILD;
GO
--1%

ALTER INDEX AllDocStreams_RbsId ON dbo.AllDocStreams REBUILD;
GO
--3.5%

I used these to rebuild the indexes, resulting in 0% fragmentation.
I found this blog post that seems to describe the same type of issue, with no solution described. And also these posts:

- "SELECT query is timing out due to SELECT STATMAN process" --> this is exactly it, but no answer..
- "BINGO: STATSMAN QUERY – KNOWN ISSUE WITH UPDATE STATISTICS" --> no answer..

Does the BrentO help team have any other pointers for me to look at ? It's massively appreciated, and I think maybe even challenging enough to take a look at ?

Thanks very much in advance,
Jeroen
Update: 14 October 2013
After posting this issue here: http://dba.stackexchange.com/questions/51303/extremely-slow-statman-query-issue-with-sp2010-farm?noredirect=1#comment90988_51303
I got some nice feedback and started re-checking the settings on the SharePoint content database. Somehow (probably during early testing while trying to resolve this issue) the database setting "Auto Create Statistics" was turned on... This is not best practice for a SharePoint content database (SharePoint's timer job takes care of creating statistics). After turning off this setting (resetting it to the default), the issue seems to be resolved. I will keep a close monitor on this for the rest of the week and let you know next week if this resolved this painful issue !
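If you want to check and reset this setting from a script instead of SSMS, here's a minimal sketch. It assumes Invoke-Sqlcmd is available (on SQL 2008 R2 it comes from the SqlServerCmdletSnapin100 snap-in) and uses my content database name; substitute your own:

Add-PSSnapin SqlServerCmdletSnapin100 -ErrorAction SilentlyContinue

#Check the current settings; both columns should be 0 (off) for a SharePoint content database
Invoke-Sqlcmd -Query "SELECT name, is_auto_create_stats_on, is_auto_update_stats_on FROM sys.databases WHERE name = 'WSS_Content_DiscoverPortal';"

#Reset to the SharePoint default; the timer job takes care of statistics
Invoke-Sqlcmd -Query "ALTER DATABASE [WSS_Content_DiscoverPortal] SET AUTO_CREATE_STATISTICS OFF;"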
Labels: Performance, Query, SharePoint 2010, Slow, SQL 2008R2, StatMan
Brent Ozar - How To Prove Your SQL Server Needs More Memory
I really enjoyed watching this short 30-minute webcast from Brent Ozar that shows you how to check whether your SQL Server needs more memory. Of course the relationship between SQL & SharePoint [xxxx] is clear and will directly impact the performance of any SharePoint implementation !
Thanks so much for sharing this info Brent !
The link to the webcast:
http://www.brentozar.com/archive/2013/09/how-to-prove-your-sql-server-needs-more-memory-video/
My personal notes from this webcast:
(I've put in the time frames of the video when Brent talks about that specific performance counter).
64 GB of memory costs less than your laptop.
- Create a business case to justify the purchase of more memory for your SQL Server.
- Less than one weekday of your salary.
- Cost of your time tuning + changing vs. cost to upgrade RAM to the next level.

Memory - Available MBytes

SQLServer: Buffer Manager - Page life expectancy (18:41)
- SQL keeps the grapes, but has to start making wine every time from scratch.
- How long can SQL keep the page data in memory (cache) before turning to disk to get the data (measured in seconds). The higher this value, the better (how long can the data be kept in memory). Some say 300 seconds is the absolute lowest.
- Even if you are above 300.. if the cache gets cleared very often you still have a problem ! (And putting more memory in won't solve this problem.)
- Also see "Buffer pool questions" for details.

SQLServer: Memory Manager - Memory grants pending (13:10)
- How many queries are waiting for memory before they can start. Should never be above 0. If above 0, you should just add memory !

SQLServer: Memory Manager - Target Server Memory
- Target = size of the barrel.

SQLServer: Memory Manager - Total Server Memory
- Total = how full is it right now.

SQLServer: SQL Statistics - Batch Requests / sec (14:27)
- How busy is my SQL Server due to incoming queries.

SQLServer: SQL Statistics - Compilations / sec (14:27)
- (Related to the above.) Building the execution plans per second. There should already be an execution plan ready, which lowers the CPU load. If SQL Server is forced to compile >10% of queries [because it can't cache them], then we need more memory.
Avoiding buying RAM:
- Parameterize queries.
- Keep your ORM up to date (NHibernate, LINQ, EF, etc.)
  - Keep these tools up to date ! (developers)
- Transition to stored procedures.
- Consider OPTIMIZE FOR AD HOC.
  - Don't keep these in the cache !
  - Only change this if this is causing the problem !
Buffer pool questions:
- What part of our database is "active" ? What percentage ?
  - Cache the active part; you have to figure this out.
- Look at which data pages are cached in memory:
  - Which databases are cached in memory ?
  - Which tables ?
  - Which data pages are being cached in memory ?
  - What queries are constantly running, thereby forcing those pages to be cached ?
- Analyse using sys.dm_exec_query_stats (BrentOzar.com/go/plans):
  - Pulls out the top 10 / 20 queries & how many resources they use.
  - Make sure to order by TotalReads.
- Analyse which pages are in memory (brentozar.com/go/pool):
  - Which databases & tables.
  - This is a one-time snapshot !
  - It may take a long time to run these queries.. run them over a course of time.
From easy to hard (aka, cheap to expensive):
- Slow, and less than 64 GB ? Learn how to explain business costs: your time and risks vs. $500 RAM.
- Memory Grants Pending > 0 ? Queries can't start; buy more memory.
- Compiles / sec over 10% of Batch Requests / sec ? SQL Server may not be able to cache plans.
- Buffer Page Life Expectancy > 300 ? You still may need more memory, but start with tuning indexes & queries.
  - http://BrentOzar.com/blitzindex - sp_BlitzIndex
  - http://BrentOzar.com/go/indextuning
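To spot-check these counters without opening Perfmon, here's a minimal PowerShell sketch (it assumes a default SQL instance; for a named instance the counter paths start with MSSQL$InstanceName instead of SQLServer):

#Take one sample of the key memory counters from the webcast
Get-Counter -Counter @(
    '\Memory\Available MBytes',
    '\SQLServer:Buffer Manager\Page life expectancy',
    '\SQLServer:Memory Manager\Memory Grants Pending',
    '\SQLServer:Memory Manager\Target Server Memory (KB)',
    '\SQLServer:Memory Manager\Total Server Memory (KB)',
    '\SQLServer:SQL Statistics\Batch Requests/sec',
    '\SQLServer:SQL Statistics\SQL Compilations/sec'
) | Select-Object -ExpandProperty CounterSamples |
    Format-Table Path, CookedValue -AutoSize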
Thursday, September 12, 2013
Script to download documents from SharePoint that match a certain view
Today I got a request to put some files from SharePoint on a USB stick.
My boss is currently in some jungle, and although they have a (satellite) SharePoint connection, which is fine to conduct searches and "define which files they need", it's almost impossible to download 11 GB over it.
That's why he told me which document sets he wanted on a USB stick, for HIS boss to take with him.
Our situation is that we have a few document libraries with about 20.000 files in them, using views and document sets to make sense of it all. Who needs folders (which are just a piece of metadata to group some files together… right )?!
That's all cewl.. but not when you want to export a few document sets and a few hundred other files grouped together via a value in a column, which obviously doesn't exist when you click the "open in explorer" button.
After some thinking, I decided to see if I could export the files to the file system via PowerShell.
Pretty soon I found this article:
This article from Jack shows the basics for downloading items from a SP list via PowerShell. My only problem was that I don't need ALL files, just certain files.
So I created a view in SharePoint that matches my criteria: basically a view per destination folder, so that it makes a bit of sense after exporting the files. (Folders.. you know.. the old-school stuff from before SharePoint libraries !)
Quite soon I found out about the SPList class & .GetItems method (the sample script I had uses SPList.Items to iterate through each item in the list).
That method doesn't support filtering on a view or query, so I changed it to the SPList.GetItems method.
You can provide a query or … VIEW to this method ! YAY…
Unfortunately I hit a snag.. in this part:

$filelocation = $item["ows_ServerUrl"]

This property "ows_ServerUrl" is not in the set of properties returned by the .GetItems method, so you can't use it to provide the file location to the download component.
After some research I came across this post:
And there I saw they provided the file (item) location using the following command:

SPFile file = item.File;
So, I tried the same method in my script and … voilà ! It was working !! Downloading the items that match the view !
Of course you can just change the filtering in the view, matching whatever files you wish to export from SharePoint, and run the PowerShell… put them in a folder that matches the view and you're done !
I didn't come across any other example of how to do this, so I thought let's contribute and write a blog post about it.
The final script:
#Script to download documents from SharePoint that match a certain view

#site url
$sourceurl = "http://**/sites/newbusiness"
#view name
$viewname = "tempj"
#destination location
$destinationfolder = "e:\trial"

$sourceweb = Get-SPWeb $sourceurl
$sourcelist = "BRGM"
$list = $sourceweb.Lists[$sourcelist]
$count = $list.Items.Count
Write-Host "iterating $count items in list"

#view to filter the export (note: $spview must be set after $list exists)
$spview = $list.Views[$viewname]
$listItems = $list.GetItems($spview)
Write-Host $listItems.Count "items in view"

foreach ($item in $listItems)
{
    Write-Host "Processing:" $item.File.Name
    if ($item.FileSystemObjectType -eq "File")
    {
        #download the file to the destination folder
        $spFile = $item.File
        [string]$target = $destinationfolder + "\" + $spFile.Name
        [System.IO.File]::WriteAllBytes($target, $spFile.OpenBinary())
    }
}
Thursday, September 05, 2013
Latest .NET security update breaks SharePoint 2010 datasheet view
The latest Microsoft update patches break the Datasheet View on #SP2010. Confirmed on 2 clients now.. (so it was working, but after running Windows Update it was not working anymore). It now returns "The selected cells are read-only" when trying to edit a field in the Datasheet view. (Any field, all of which were previously editable.)
The Datasheet view gives "column is read-only" after one of these updates is installed on a Win7 + Office 2010 client:
- KB2840628v2 (Security Update for Microsoft .NET Framework 4 Client Profile)
- KB2835393 (Security Update for Microsoft .NET Framework 4 Client Profile)
- KB2804576 (Security Update for Microsoft .NET Framework 4 Client Profile)
- KB2789642 (Security Update for Microsoft .NET Framework 4 Client Profile)
- KB2742595 (Security Update for Microsoft .NET Framework 4 Client Profile)
- KB2737019 (Security Update for Microsoft .NET Framework 4 Client Profile)
- KB2736428 (Security Update for Microsoft .NET Framework 4 Client Profile)
- KB2729449 (Security Update for Microsoft .NET Framework 4 Client Profile)
- KB2656351 (Security Update for Microsoft .NET Framework 4 Client Profile)
- KB2604121 (Security Update for Microsoft .NET Framework 4 Client Profile)
- KB2600217 (Update for Microsoft .NET Framework 4 Client Profile)
- KB2533523 (Update for Microsoft .NET Framework 4 Client Profile)
- KB2468871 (Update for Microsoft .NET Framework 4 Client Profile)
- KB2742595 (Security Update for Microsoft .NET Framework 4 Extended)
Please let me know if you run into similar problems, or how to notify Microsoft about this. I did have to install the 2007 Office System Driver: Data Connectivity Components to regain the Datasheet functionality in the first place.
(A re-install of this tool does not regain the functionality.)
#SharePoint2010 #SP2010 #Datasheetview #Office2010
Tuesday, July 30, 2013
SP2010 Datasheet view not working (but different)
Ok, running into a bit of trouble here..
We 'discovered' we actually need a publication-date piece of metadata. No worries: update the content type hub with the additional metadata, re-publish the content type, activate some timer jobs, wait for it... yeah ! It's been pushed down to all site collections and subscribing farms :)
Wonderful.
Now we just need to populate a few thousand documents with the publication date.
Ok.. let's create a view that filters all files, removes all folder structure (annoying folders anyway!) and filters on [year] in the document name.
Alright ! Let's use the free batch editor I got from somewhere, check all files out (100 at a time), put in the date, check them all in and ... wait for it ... damn.. not working !
Probably a bug in the batch-edit tool.
Ok, let's try datasheet view then.. because then you can just drag the date down for all items. Datasheet view doesn't work, for all kinds of 64-bit reasons. Google is full of potential answers.. but I just don't have time to fix this right now / install all kinds of Office stuff / re-install Office as 64-bit etc. etc.
So... let's fire up the Win8 machine inside an Oracle virtual machine (free).
Connect to the test library, datasheet view.. works ! Yay ! Ok, connect to the production library.. doesn't work ! Nay !!!
Wtf is going on ... ???! Same OS, same browser, same farm.. different libraries.. different views.
Googling this problem is a challenge; it's a case of.. you've got a certain problem which results in certain keywords (datasheet view + SharePoint 2010 + failure), but you're actually NOT having the problem 90% of the people with these keywords have.... I just hate it when this happens.. but also like it, because it's a challenge :)
So.. after killing off some search results, I came across this one: http://social.msdn.microsoft.com/Forums/sharepoint/en-US/59b06220-3016-4c3f-819c-adb0fa16f7f0/issues-with-the-custom-filter-in-datasheet-view
Where dlanders1 shares his findings around the datasheet view and the influence of filtering on it ! (Thank you!)
So I adjusted the filter in the view, tried datasheet again, and it works :)
Conclusion:
If you want to use datasheet view and you get the error: The Standard View of your list is being displayed because your browser does not support running ActiveX controls.
Don't pay too much attention to the ActiveX controls part of this error... You probably just need to make a simpler filter in the view !
Because there is so much irrelevant information out there, I thought I'd create a blog post about this.. just to add a possibly relevant drop to the ocean !
Tuesday, June 25, 2013
Quick fix: Opening images by using the document ID service
"Why is this not working ?!!"
What ?
Well, I copy the document ID link (copy shortcut) and use this shortcut to create a link in a PDF document, but when I click on it, SharePoint (2010) gives me an overview of the properties, not showing me the image !
Ok, I'll look at it..
Turns out that the document ID retrieval page (/_layouts/DocIdRedir.aspx?ID=XXXXXX) is working for PDF or Office documents, but a simple JPG file is not being shown. Instead, it shows the properties of the JPG file. So this:
And not this:
How to fix this ?
The document ID retrieval mechanism is connected to the enterprise search configuration.
To fix this issue (and other document types that show the same behavior), go to your Central Admin -> Search Service Application Properties -> File Types configuration.
Make sure that the file type you are trying to display is registered in this list.
As a next step, you will need to clear your search index via the "Index Reset" link in the same location in Central Admin, and then execute a full crawl.
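If you prefer doing the reset and the full crawl from PowerShell instead of Central Admin, a minimal sketch (this assumes a single Search Service Application and the default "Local SharePoint Sites" content source):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$ssa = Get-SPEnterpriseSearchServiceApplication
#Reset the index (disable alerts, ignore unreachable servers)
$ssa.Reset($true, $true)

#Kick off a full crawl of the default content source
$cs = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint Sites"
$cs.StartFullCrawl()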
Immediately after the full crawl I tried again; unfortunately it still wasn't working. But after an additional hour of Googling I tried again, and this time it worked ! (Maybe the crawl said it was finished but it actually wasn't yet !)
Problemo Solved.
Because there is not much info on this out there, I decided to create a blog-post.
Thanks to this site for pointing me in the right direction !
Wednesday, May 08, 2013
Using fun and interactive training to create better SharePoint user adoption
This morning, I found an interesting link in my daily Yammer overview email:
Kirk Evans: I'm bringing Clippy back.
Create the Best App for Office 2013 in 5 Minutes - Kirk Evans Blog - Site Home - MSDN Blogs
Visiting this MSDN blog post, I soon saw the old Clippy back on my screen ! This time inside a browser window, using JavaScript libraries. Instantly my creative mind was going crazy ...
I'm currently battling to get end-users to work with the beautiful SharePoint 2010 intranet & document management system. I know there are many different reasons why users won't go to the intranet: lack of relevant content, fear of the new, no user training, no clear internal policies about which process (or document) goes where and who to share it with, etc. etc.
But what if..
The relatively inexperienced users (in a "working in a digital way" kind of way) would be welcomed to the "new" intranet by a cute little Puss in Boots, showing them around, giving a little animation while they execute their first search query, offering tips & tricks and latest news (link) updates, and maybe guiding them through the intranet's possibilities and procedures.
Since the "agents" are now completely accessible via JavaScript, it must be possible to integrate them into SharePoint (in this case 2010).
Let's do a proof of concept and see what the users say !
Demo site: https://www.smore.com/clippy-js
Thanks Kirk, for bringing Clippy back !
Thursday, April 25, 2013
Manage deletion of index items (SharePoint Server 2010)
While re-configuring the search configuration and checking / validating the search functionality, I discovered around 15.000 errors in the crawl log.
The reason for this was that I had changed the architecture for a specific site collection. In my first setup, I had a site collection with about 55 sub-sites. Each sub-site had its own document library and security profile.
I had already migrated about 15.000 documents into some of these sub-sites when I realized this structure wasn't going to work for me.
The maintenance and template-creation overhead of these sub-sites was too high, together with some problems in SharePoint with these sub-sites (I could not get the import server to see the sub-sites to import the data into them, and some users complained they could not see the sub-sites even though they had access to them).
So, I decided to delete all sub-sites (thank god for PowerShell) and created 55 document libraries under the main site collection.
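A minimal sketch of the kind of clean-up loop I mean (the site collection URL is a hypothetical placeholder, and Remove-SPWeb deletes permanently, so test this on a copy first):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$site = Get-SPSite "http://portal/sites/companies"   #hypothetical URL
#Delete every sub-site under the root web, deepest first in case of nesting
$site.AllWebs |
    Where-Object { -not $_.IsRootWeb } |
    Sort-Object { $_.Url.Length } -Descending |
    ForEach-Object { Remove-SPWeb $_ -Confirm:$false }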
Of course, the search crawlers had already crawled the previous content (with sub-sites), and when executing the full crawl schedule now, it ended up not finding the 15K documents (and logging 15K errors !).
Thanks to this great TechNet article I found the correct properties to change in the SharePoint 2010 Search Service Application. It was quite a challenge because of the many documented other problems with SP2010 search (404 not found -> loopback check, etc. etc.), but after a while I found the relevant information:
Delete policy for access denied or file not found:
- ErrorDeleteCountAllowed: default value 30
- ErrorDeleteIntervalAllowed: 720 hours (30 days)
So it would have taken SharePoint 30 + 30 days (2 months !) before it would have removed the reference and resolve the error automatically...
I made the numbers a bit lower and saw this morning that the crawler has deleted 15K references :)
Next: to find out why the SearchContentAccess account can't access 2 site collections in a web application that allows this account FULL READ policy access (and works for the other 50+ site collections..)
Cheers, Jeroen
Command to change the policy-parameter values:
$SearchApplication = Get-SPEnterpriseSearchServiceApplication -Identity "<SearchServiceApplicationName>"
$SearchApplication.SetProperty("<PropertyName>", <NewValue>)
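For example, to lower the thresholds (the identity and the values here are just an illustration; pick numbers that fit your situation):

$SearchApplication = Get-SPEnterpriseSearchServiceApplication -Identity "Search Service Application"
$SearchApplication.SetProperty("ErrorDeleteCountAllowed", 10)
$SearchApplication.SetProperty("ErrorDeleteIntervalAllowed", 24)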
Tuesday, April 16, 2013
SharePoint 2010 "File already exists" backup error
Yesterday I asked the community to help me with a #wickedProblem regarding the SharePoint production farm backup. Basically it gave me 8 errors, all caused by one main error:
InvalidOperationException: Job "Search Backup and Restore (first phase backup) application b46e25e7-2307-4...restofid..-query-0 server [servername]" failed with error: The file '\\server\share$\spbr0000\b46e25e7-2307-4..sameasabove-query-0\projects\portal_content\indexer\cifiles\00010002.ci' already exists.
I created a forum post for this:
http://social.technet.microsoft.com/Forums/en-US/sharepointadminprevious/thread/5fa9c4cb-b121-4c95-b01f-01153291ec1c/#0cfce1bb-11c9-4839-b570-dbca48dec704
and thanks to Cornelius J. van Dyk I found out that my 'emptying of the backup folder' before starting the backup wasn't really effective, because of the default "Don't show hidden files, folders, or drives" Windows Explorer setting.. Yup, changing the setting to actually 'see' all files made me able to delete all files on the share :)
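A minimal sketch of doing that clean-up from PowerShell instead, where -Force also includes hidden and system files (the share path mirrors the redacted one above; substitute your own):

#Empty the backup share, including hidden files, before starting a new farm backup
Get-ChildItem "\\server\share$" -Recurse -Force | Remove-Item -Recurse -Force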
Ran the backup again and this time it finished:
Finished with 0 warnings.
Finished with 0 errors.
Another problem solved :)
Tuesday, April 09, 2013
How I added 250.000+ documents and 3.5 Million pieces of metadata to 152 #SP2010 document libraries
The approach I took was the following:
First, I created a separate site collection to hold the 8000+ documents; just from a content-database point of view this made the most sense.
After enabling all branding features on this site collection, I created a template document library which had the appropriate content type enabled, the correct view and managed metadata navigation configuration.
Next, I used a tool called DocKIT from VYAPIN software to create a basic metadata Excel document from the file share which contained the documents that needed to go to SharePoint.
http://www.vyapin.com/products/sharepoint-migration/dockit/sharepoint-migration.htm
This basic .XLS contains all file references (path), and per record (file) you can add the metadata needed to fill the content type in the destination location (library). Because it's all in Excel (the tool reads from this document later), it's easy to copy and paste the metadata into the document and complete the metadata in bulk, offline.
Next, I created a document library per company's documents; this was done mainly for security reasons. (People who have rights to company A's documents don't necessarily have rights to company B's documents. To manage these rights in bulk, separate document libraries are used.)
I started off by creating separate sites (sub-sites) per company (152 in total), using a PowerShell script to create the sub-sites, but I abandoned this idea after some sites seemed to be inaccessible (they just did not exist) for some users and for the import tool. Very strange behaviour from SP2010, but I didn't have time to do huge research on why this was happening.
The metadata.xls document contained the 'destination library location' for each document (again, a copy & paste within Excel makes this definition very fast), and after completing all metadata for the 8000+ documents, the file was ready to use as import data for the DocKIT tool.
This tool reads each file from its "Path" location, uploads it to the destination library on SharePoint, attaches the right content type to the file and applies the configured metadata, all automatically.
After completion, 8000+ documents were uploaded into 152 document libraries, mostly automated :)
Each document contains 14 pieces of metadata; in total, more than 112,000 metadata additions in SharePoint 2010.
I created the document libraries automatically, based on the data in the Excel document (the company name is a piece of metadata that was available as the document library name), using the library template as one of the parameters.
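A minimal sketch of that library-creation step (I'm assuming here that the Excel sheet was saved as a CSV with a "Company" column; the site URL and template name are placeholders):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web = Get-SPWeb "http://portal/sites/companydocs"        #hypothetical URL
$template = $web.ListTemplates["CompanyLibraryTemplate"]   #the saved library template

#Create one document library per unique company name from the metadata sheet
Import-Csv "C:\import\metadata.csv" |
    Select-Object -ExpandProperty Company -Unique |
    ForEach-Object { $web.Lists.Add($_, "Documents for $_", $template) | Out-Null }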
I'm happy to share more details if need-be on the configuration of DocKIT or any of the other items described here :)
Update: using the same principle, I also created a "Public Domain Documents Silo" for this client, uploading 220.000 documents (mostly OCR'd PDFs), all with around 15 pieces of metadata attached. (I wrote a little C# program that got most of the existing information, like folder structure, into the metadata.xls document.)
This site collection is fully searchable (SharePoint 2010 search) and VERY fast, under 3 seconds. The nice thing is that because all documents have so much metadata, the search is fully refinable, creating a great end-user experience.
Thursday, January 24, 2013
Error during SP2013 RTM installation
During the installation of SP2013 RTM I got the following error:
error writing to file: Microsoft.Office.Server.Search.dll verify that you have access to that directory.
I was using the SW_DVD5_SharePoint_Server_2013_64Bit_English_MLF_X18-55474.ISO mounted to Oracle VM.
After several attempts & re-installing Windows Server 2008 R2 Sp1 without any result, I decided to get a new .ISO file: en_sharepoint_server_2013_x64_dvd_1121447.ISO
Now the install ran like a charm :)
Wednesday, January 23, 2013
quick one... Extension...
Just a quick post... In my SharePoint deployment at a client, I included a little maintenance plan on the SQL server.
After a few weeks I got a call from the client: DUDE ! Our SQL Server disk is FULL !
Checking the usual suspect (logfiles not cleared during backup) I quickly realized that the problem was the 'cleanup task' in the maintenance plan wasn't deleting the old SQL backup files.
Why ?
For the file extension, I used the value ".bak"; you shouldn't add the dot in there.. just type "bak".
Problemo solved !