Sunday, November 21, 2010

TechNet Virtual Lab: SharePoint RTM - IT PRO - Service Applications

Objectives
This lab’s objective is to gain familiarity with how to:

- Configure the new Managed Metadata Service Application
- Associate the Managed Metadata Service Application with a Web Application
- Manage the Metadata Service by adding your own custom Groups and Term Sets
- Import a group into the Enterprise Term Store within the Metadata Service
- Utilize the Managed Metadata Service Application within a list
- Configure My Site settings
- Create a My Site



Adding, Configuring, and Consuming the Managed Metadata Service Application

Scenario
In this exercise you will add, configure, and consume the managed metadata service application

Task 1: Creating a New Managed Metadata Service Application

Lol @ welcome message:




Creating a new managed metadata service:






Cewl new features:
- choose failover database
- choose content hub website (Enter the URL of the site collection (Content Type hub) from which this service application will consume content types.)
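
For reference, the lab does this through Central Administration, but the same service application can also be created from the SharePoint 2010 Management Shell. A minimal sketch (the names, application pool, database and hub URL below are illustrative assumptions, not the lab's values):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Create the Managed Metadata Service Application, with a failover server and a content type hub
$mms = New-SPMetadataServiceApplication -Name "Managed Metadata Service" `
    -ApplicationPool "SharePoint Web Services Default" `
    -DatabaseName "ManagedMetadata_DB" `
    -FailoverDatabaseServer "SQLFAILOVER" `
    -HubUri "http://intranet.contoso.com/sites/contenttypehub"

# Create a proxy and add it to the default proxy group so web applications pick it up
New-SPMetadataServiceApplicationProxy -Name "Managed Metadata Service Proxy" `
    -ServiceApplication $mms -DefaultProxyGroup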

Task 2: Managing the Metadata Service


Lol @ Do not click the link itself but anywhere in the row to select it ...nice UI !!

Creating Groups, Term Sets, Terms
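
If you prefer script over the Term Store Management Tool, groups, term sets and terms can also be created through the Taxonomy API from PowerShell. A rough sketch (the term store name and the Contoso/Departments/Finance names are made-up examples):

# Run from the SharePoint 2010 Management Shell
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Taxonomy")

$site = Get-SPSite "http://intranet.contoso.com"
$session = New-Object Microsoft.SharePoint.Taxonomy.TaxonomySession($site)
$termStore = $session.TermStores["Managed Metadata Service"]

# Group -> Term Set -> Term, then commit the changes to the term store
$group = $termStore.CreateGroup("Contoso")
$termSet = $group.CreateTermSet("Departments")
$term = $termSet.CreateTerm("Finance", 1033)
$termStore.CommitAll()
$site.Dispose()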
Task 3: Adding a Group by Importing it
(importing a CSV, easy peasy)

Task 4: Consuming the Terms from a List



Hmm missing some interface options.. this doesn't look like:



Maybe it's because of the very small screen... (which I can't get any bigger).

Solution: press the Manage Views icon first; the Create Column option appears after that.


Selecting the managed metadata as column type



Selecting the pre-defined enterprise values
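
The lab does all of this through the Create Column page; the rough PowerShell equivalent, using the Taxonomy API, would look something like the sketch below (the list name, column name and term set names are hypothetical):

# Run from the SharePoint 2010 Management Shell
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Taxonomy")

$web = Get-SPWeb "http://intranet.contoso.com"
$list = $web.Lists["Projects"]   # hypothetical list

$session = New-Object Microsoft.SharePoint.Taxonomy.TaxonomySession($web.Site)
$termStore = $session.TermStores["Managed Metadata Service"]
$termSet = $termStore.Groups["Contoso"].TermSets["Departments"]

# Create a managed metadata (taxonomy) column bound to the term set and add it to the list
$field = [Microsoft.SharePoint.Taxonomy.TaxonomyField]$list.Fields.CreateNewField("TaxonomyFieldType", "Department")
$field.SspId = $termStore.Id
$field.TermSetId = $termSet.Id
$list.Fields.Add($field)
$web.Dispose()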




Review People Configuration and My Sites

Scenario
In this exercise you will review the People configuration and configure My Sites.


Task 1: Review People configuration




Task 2: Configure My Sites


While reviewing this screen, you will notice that when the Farm Configuration Wizard created the People Service Application, it configured the http://intranet.contoso.com Web Application to host My Sites. The My Site Host site collection was created at http://intranet.contoso.com:80/my, and the My Sites will be hosted under the managed path “my/personal.” The only thing left is to confirm that Self Service Site Creation is on.



10. In the Ribbon, click Self-Service Site Creation.
(note: this option is under the Security icon in the ribbon)
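
For the record, the same switch can also be flipped from PowerShell; a small sketch using the lab's web application URL:

$wa = Get-SPWebApplication "http://intranet.contoso.com"
$wa.SelfServiceSiteCreationEnabled = $true   # same setting as Security -> Self-Service Site Creation
$wa.Update()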

! Clicking the 'My Content' link is a challenge, as the text is white.. just click right after the 'My Newsfeed' link and it will open the screen below.



(this 'my content' link isn't shown:)



my profile:



Nice 'twitter' integration ?? :)

Task 3: Conclusion

In this lab you spent some time learning how to use and configure the new Managed Metadata Service, as well as configure My Site creation options. You should now be able to…
- Create a new Managed Metadata Service Application (MMSA)
- Associate the Service Application with the Portal Web Application
- Manage the Metadata Service, add a group by import, and consume the terms from within a list
- Configure My Site creation settings for domain users
- Create a My Site

For more information on SharePoint Server 2010, visit the SharePoint Products and Technologies team blog at http://blogs.msdn.com/sharepoint for regular announcements and updates.

Microsoft Codename Atlanta (part2)

In part 1 I ended with my request to the network administrator for opening up port 445 to the Atlanta test server.
The next day I got a mail saying it wasn't port 445 I needed but SSL, port 443, which was fine (maybe the network admin had checked the firewall request logs?).
After testing the general internet connection I tried to install the Atlanta monitoring service, which worked!
Unfortunately the service won't start, so probably more ports need to be opened to the cloud...

TechNet Virtual Lab: SharePoint RTM - IT PRO - Upgrade

Lab Source: http://technet.microsoft.com/en-us/bb512933.aspx

Tasks:
- Verify existing 2010 farm readiness to upgrade specific content databases using the 2010 Test-SPContentDatabase cmdlet.
- Initiate upgrade for individual content databases using the 2010 STSADM -o addcontentdb command.
- Review the upgrade session status using the improved Central Administration web site Upgrade Status page.
- Initiate upgrade for multiple individual content databases using multiple PowerShell sessions to trigger parallel upgrade sessions.
- Troubleshoot an upgrade failure due to missing features, and restart upgrade for an individual content database.
- Use visual upgrade features to switch sites from the 2007 product look and feel to the new 2010 product user interface.

The Test-SPContentDatabase cmdlet generates a nice report with all possible technical flaws for converting the 2007 SharePoint content database to SharePoint 2010.

Task 2: Create a new Web Application for Restore
Central admin ->manage web applications
In the ribbon, click the New icon on the left; a Create New Web Application window will open.
(creating new web application in SharePoint 2010) (which takes like forever !!)

Task 3: Test Existing MOSS 2007 Content Databases





Testing:
(From the SharePoint 2010 Management Shell)
Test-SPContentDatabase -Name WSS_Content_1 -WebApplication http://moss.contoso.com

(Gives a dos-box with all problems and solutions)



Exercise 2 - Gain Upgrade System Familiarity
(which turns out to be running just one command line in a command window, very simple actually !)
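
That one command is the addcontentdb attach-and-upgrade, roughly like this (database name taken from the earlier test command; the exact parameters in the lab script may differ), with the PowerShell equivalent for completeness:

# Classic stsadm attach-and-upgrade
stsadm -o addcontentdb -url http://moss.contoso.com -databasename WSS_Content_1

# PowerShell equivalent (SharePoint 2010 Management Shell)
Mount-SPContentDatabase -Name WSS_Content_1 -WebApplication http://moss.contoso.com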

Task 2: Upgrade Two Content Databases at the Same Time
4. Wait for both command windows to close before continuing. Each window will close automatically. It is normal for one window to close before the other. This script may take approximately 5 to 8 minutes to complete. Progress can also be monitored in each of the command windows opened.
(it takes a while :P)
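
The lab hands you a prepared script for this; the idea boils down to spawning a separate PowerShell process per content database so the upgrades run in parallel. A minimal sketch of that idea (database names are illustrative):

# Launch one upgrade session per content database, each in its own window
$dbs = "WSS_Content_2", "WSS_Content_3"
foreach ($db in $dbs) {
    Start-Process powershell.exe -ArgumentList "-NoExit", "-Command", `
        "Add-PSSnapin Microsoft.SharePoint.PowerShell; Mount-SPContentDatabase -Name $db -WebApplication http://moss.contoso.com"
}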

Exercise 3 - Troubleshoot Upgrade Failures and Resume Upgrade

Scenario
In this exercise you will troubleshoot an upgrade issue by reviewing the upgrade log and then installing a missing solution and restarting upgrade from the command line.


Task 1: Open Failed Upgrade Log

Task 2: Find Errors in the Upgrade Log
(Opening logfile with notepad (copy & paste)..)

Task 3: Install Missing Solutions and Features
! The .ps1 file is not in the F40 folder but one folder higher.

oops… doesn’t work..
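
For context, what the task is trying to do comes down to adding and deploying the missing solution and then re-running upgrade on the failed content database; a hedged sketch (the solution file path/name and database name are made up):

# Add and deploy the missing solution, then resume upgrade on the failed database
Add-SPSolution -LiteralPath "C:\LabFiles\MissingFeature.wsp"
Install-SPSolution -Identity "MissingFeature.wsp" -GACDeployment
Upgrade-SPContentDatabase -Identity WSS_Content_1 -Confirm:$false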


Exercise 4 - Explore Visual Upgrade Features

5. Look at the page, noting that the user interface is still using the MOSS 2007 look and feel. The content database that contains this site collection was initially attached and upgraded with the -PreserveOldUserExperience option set to true, which is the default behavior.

The old interface takes forever to load..
Finally it results in ‘An unexpected error has occurred.’









Task 3: Use the Visual Upgrade Preview

(still waiting for the old interface).
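
While waiting: under the hood Visual Upgrade boils down to a UIVersion property on each web. A sketch for switching a whole site collection to the 2010 look from PowerShell (site URL is illustrative):

$site = Get-SPSite "http://moss.contoso.com"
foreach ($web in $site.AllWebs) {
    $web.UIVersion = 4                           # 4 = SharePoint 2010 user interface, 3 = 2007 look
    $web.UIVersionConfigurationEnabled = $false  # hide the Visual Upgrade preview option
    $web.Update()
}
$site.Dispose()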
In this lab, you learned how to…
• Verify existing 2007 farm and content upgrade readiness through the use of the 2007 pre-upgrade checker command.
• Verify existing 2010 farm readiness to upgrade specific content databases using the 2010 Test-SPContentDatabase cmdlet.
• Initiate upgrade for individual content databases using the 2010 STSADM -o addcontentdb command.
• Review the upgrade session status using the improved Central Administration web site Upgrade Status page.
• Initiate upgrade for multiple individual content databases using multiple PowerShell sessions to trigger parallel upgrade sessions.
• Troubleshoot an upgrade failure due to missing features, and restart upgrade for an individual content database.
• Use Visual Upgrade features to switch sites from the 2007 product look and feel to the new 2010 product user interface.

Thursday, November 11, 2010

Microsoft Codename Atlanta (part1)

monitoring your SQL Servers by the specialist and at the specialist!
That sounds really fantastic and like the future ! No more complex and extensive installation and configuration of SCOM 2007 in the organization, SQL Monitoring in the CLOUD !! wow.. that sounds really really good, time to get hands-on :D :D
http://technet.microsoft.com/en-us/library/ff962512.aspx

Step1: Registration easy peasy with windows live ID

Step2: Installation -> certificate: easy
The SCOM agent client was a bit of a hassle since I had tested a SCOM environment before; this client can’t upgrade
the previously installed client and exits with an error.
Uninstalled everything with SCOM in it, tried again and success: it now installed the agent.

Next up: install the gateway (“A gateway is an internet-connected server on your network”).
Here the fun starts… :


Tried to use the proxy server configuration: no success.. Asked the network admin; he said: you can try to download the MS Firewall Client for ISA Server. Installed it (reboot of course) and re-tried, but
unfortunately the Gateway Settings setup still can’t connect to the cloud.

Which makes me wonder how I can connect to the ‘cloud’ behind a firewall…

Finding some more documentation on the Microsoft support pages reveals:
http://onlinehelp.microsoft.com/en-us/atlanta/ff962505.aspx
Communication between the Atlanta agent and gateway

In order to enable communication between agents and gateways, the following requirements must
be met. Note that this configuration is enabled by default, so no specific action should be required.

Both the agent and the gateway need to be in trusted domains.

Gateway:
- Ensure that the Server service (lanmanserver) is enabled.
- Open TCP port 445 for SMB inbound traffic.

Agent:
- Ensure that the Server service (lanmanserver) is enabled.
- Ensure that TCP port 445 is open for outbound SMB traffic. This port is open by default.
euh.. not on our firewall ..
(Just posted the request to the network admin …)
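
In the meantime, a quick way to check from the agent machine whether a given TCP port is actually reachable (the gateway host name below is hypothetical):

# Plain TCP connect test, works on PowerShell 2.0
$client = New-Object System.Net.Sockets.TcpClient
try {
    $client.Connect("atlanta-gateway.contoso.local", 445)
    "TCP 445 reachable"
} catch {
    "TCP 445 blocked: $($_.Exception.Message)"
} finally {
    $client.Close()
}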

Friday, October 29, 2010

(performance) issue solving by penguins

Skipper, look.

Analysis.

Looks like a small bulb used to indicate
something unusual, like a malfunction.

I find it pretty and somewhat hypnotic.

That too, sir.

Right! Rico?

Manual!

(hits the light with the manual)

Problemo solved.

- We may be out of fuel.
- Why do you think so?

We've lost engine one...

...and engine two is no longer on fire.

Buckle up, boys.

(thanks)

Friday, March 19, 2010

SSIS Traps

Okay, so our 8-year-old SQL Server 2000 database server died..
At least, the SCSI RAID failed; all databases had been migrated, so IT Infra turned it off... There was only one little DTS package which was turned off with it..

So, after a few weeks I got a call: I put my Excel file in the specified folder, but the website doesn't get updated ..

Off to the server room, turning on the server, going back to my desk; still no reply to my ping.. back to the server room again .. finding a monitor & keyboard: RAID FAILURE, press F1 to continue.. okay: F1 :P one drive disabled, but the fun still continued :)

Win 2000 started; I immediately started the SQL management studio, found the DTS package and started taking pictures (the server had no network connection) with my SLR :)

Okay, easy peasy; just a little DTS reading a CSV and putting the information in the database (luckily another SQL server, not this broken one).

5 mins work, at most....... right...............

Of course, it's 2010 now, DTS is no more ! It's SSIS !! New & fun stuff !!
Let's quickly re-create this little DTS in SSIS :) :)
Oh, need the Business Intelligence Development Studio (1000 MB) to do this.. Okay, have the SQL 2008 R2 CTP right here !! Let's install !! (see install issues in the previous blog post.. )

So, I installed SQL 2008 R2, seeing NOTHING in the program files: no Management Studio, no BI Development Studio, no nothing;
started the install again ... checked all necessary components again, and it began installing .. again.. After the install I got the tools !!

So let's get started ! (5 mins are WAAAAAAAAYYYYY gone already at this phase.. ;)

First trap:
Connections -> Data Sources to create a new data source and re-use it
(use 'New Connection From Data Source' in the Connection Managers pane at the bottom of the screen)

Using an Execute SQL Task & a Data Flow Task

Making the connection with the CSV:
After double-clicking the Data Flow Task, choose 'Flat File Source' from the toolbox and drag it onto the design surface. Double-click it, fill in the path and create a new Flat File Connection.

Here comes the second trap

DTS by default truncated data which did not fit into a column (or maybe it didn't care about column width at all); SSIS DOES care about column width, so the default width of 50 is maybe a little tight and easily overlooked.
In my case I had one column with 689 chars, so SSIS turned red for the first time real soon, giving a truncation error.
At first I changed all the column widths to 255 / 689, exactly according to the destination database data types. The truncation errors stayed as if nothing had happened.
So, apparently I had a column somewhere which was > 255, nice..

I wasn't feeling like checking 4000 rows to find out what the max column width was..
So, googling.. found you can change SSIS's behaviour on checking / failing on this. It's configured by double-clicking the flat file source and choosing Error Output -> change the Truncation column from 'Fail component' to 'Ignore failure'. Second problem fixed.

Third trap
As I'm importing the data into a SQL Server, it might look obvious to choose 'SQL Server Destination' as a destination for the CSV file. This is another trap :)
As it turns out, the name 'SQL Server Destination' implies that you want to do a 'BULK INSERT', which is damn fast blablabla BUT not supported against remote SQL Servers. So your SSIS package must run on the same server as the destination database.
Which, in my case, isn't so. Luckily SSIS gives a very descriptive error about that :) So I chose an OLE DB Destination :)
Third trap fixed :)

Fourth trap
Column mapping. Remembering the good old days where my DTS just used to discover the database columns and CSV file columns and automatically map them.. SSIS doesn't seem to understand col001 - Column 01 mapping ... So, changing those names 34 times ... and again in the mappings screen, 34 times.. sigh.. did I say 5 mins ??

Okay, renamed the columns, automap works ! (no need to draw 34 lines... :P )

Pressing play.... GREEN ! It actually turned green !! omg !!
Only one problem.. my data is in between " characters .. "bla1","bla2" etc
And my SSIS nicely places it in the database as "bla1"
:) ooh that's an easy one, I saw the 'text qualifier' option in the Flat File Source options screen; placing a " as text qualifier, press play again -->RED

resulting in the Fifth Trap:
if you use the " as text qualifier, SSIS gives an error saying it can't qualify the last column. I fixed this using the Flat File Connection Manager Editor.
In the 'Advanced' pane, go to the last column and change its TextQualified value from True to False; problem fixed :)
Pressing play, GREEN again :) but already noticed the Sixth Trap.

As I'm doing a full import every time, I need to truncate the table before filling it with fresh data. So I had already put a TRUNCATE TABLE Execute SQL Task in my SSIS package with a nice green arrow to the Data Flow task. And this works fine, only because of the Fifth Trap I noticed this is not really best practice, as it results in an empty table when something goes wrong in the Data Flow task. So I need transaction-based execution.
In the Execute SQL Task I changed the property 'TransactionOption' from Supported to Required. This resulted in the Seventh Trap.

Transactions
Changing this option immediately resulted in a RED SSIS package with the error
[Execute SQL Task] Error: Failed to acquire connection "Bla". Connection may not be configured correctly or you may not have the right permissions on this connection.

(oh by the way, we're entering the second day in this '5 min' time-path now..)
Googling for this error and for enabling transactions in SSIS; one option you should enable is 'RetainSameConnection' = True in the shared data source properties.
This didn't help. So googling further..

After a few hours of googling and trying, I thought I'd have to dig into DTC issues.
Then I looked into the book 'Microsoft SQL Server 2008 Integration Services', page 510, which gives a sample without the use of the DTC; implemented this and.. it works !
I didn't even specify a failure path, it just works out of the box: defining T-SQL statements 'BEGIN TRANSACTION' and 'COMMIT TRANSACTION'.
Just to be safe I also defined a 'ROLLBACK' on the failure paths :)
BE AWARE that all tasks must use the same Connection Manager. In addition to this, you must make sure that the RetainSameConnection property on the Connection Manager is set to True ! (page 510).

Eighth Trap
The failure paths default to the 'Logical AND. All constraints must evaluate to True' setting, which implies there can only be one failure path. So I changed this to the OR mode, which makes all red lines go dotted and works great :)

Ninth Trap
Now this thing works on my dev machine; got to get it onto production SQL ! Trying to publish to a SQL 2005 server is NOT going to work. So, publish it to an SSIS-enabled SQL 2008 production server.. euh.. we ain't got those.. sigh..
So, I just published to my own dev SQL 2008 and asked IT Infra to create a production version :)

Last trap
Okay, just before publishing I discovered an error. As it seems, my solution in the 'Fifth Trap' isn't working.. the last column still has data surrounded by "blabla" in the result set. I spent the rest of the afternoon googling for solutions but only found the same issues. Now asking for a change in the export format, with a different column or data delimiter..

[Flat File Source [1]] Error: The column delimiter for column "Column 31" was not found.
-->prob. I'm having this issue: http://social.msdn.microsoft.com/Forums/en-US/sqlintegrationservices/thread/a2ed5d71-b749-47ed-abb2-d745b0a2020a

To be continued.....
Resources:
http://ssisblog.replicationanswers.com/2008/01/10/escaping-double-quotes-in-flat-files-ssis-is-different-to-dts.aspx

Installing SQL Server 2008 R2 November CTP

Okay, apparently you need to hack your registry to get over the 'reboot your system' issue during the install process... :(

To fix this:
search your registry for: PendingFileRenameOperations
and delete every PendingFileRenameOperations value you find !

No reboot required, just do a 'rescan' and this issue is solved :)
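
For reference, this value usually lives under the Session Manager key; something like the following clears it from PowerShell (export the key first if you want a way back):

# PendingFileRenameOperations is a value on the Session Manager key
$key = "HKLM:\SYSTEM\CurrentControlSet\Control\Session Manager"
Get-ItemProperty -Path $key -Name PendingFileRenameOperations   # see what is pending
Remove-ItemProperty -Path $key -Name PendingFileRenameOperations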


Thanks: http://social.msdn.microsoft.com/Forums/en-US/sqlsetupandupgrade/thread/988ab9e3-26ce-48da-ad61-c458f5c9c539