Common issues which arise when you use Import-SPMetadataWebServicePartitionData

Import-SPMetadataWebServicePartitionData is a PowerShell cmdlet that is useful when moving an entire Term Store between environments. This may be between Dev / QA / Staging / Production environments, or between multiple farms in a distributed environment.

It is important to note that this is an “all or nothing” process. Importing will completely overwrite your existing term store, so proceed with caution and make sure you take good backups before you run this command.

ISSUE: You cannot find the ID of your MMS service application
This is needed to perform the export and import of your MMS data.
 
FIX #1:
Browse to your term store, and get the TCID parameter (GUID) from the query string in the page address.

FIX #2:
Use PowerShell to find the correct value:

## $taxonomySite is the URL of (or an SPSite object for) any site attached to the MMS proxy
$taxonomySession = Get-SPTaxonomySession -Site $taxonomySite
$termStore = $taxonomySession.TermStores[$termStoreName]  ## the term store's display name
$termStoreID = $termStore.ID

This can then be passed in as the -Identity parameter to Import-SPMetadataWebServicePartitionData.
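
For reference, a minimal sketch of the full call (the proxy lookup assumes a single Managed Metadata connection in the farm, and the server and share names are purely illustrative):

## Find the Managed Metadata service application proxy
$mmsProxy = Get-SPServiceApplicationProxy | ? { $_.TypeName -like "*Managed Metadata*" }

## Import the CAB file produced earlier by Export-SPMetadataWebServicePartitionData
Import-SPMetadataWebServicePartitionData -Identity $termStoreID -ServiceProxy $mmsProxy -Path "\\SQLSERVER01\MMSExport\mmsdata.cab"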

ISSUE: You do not have permission to use the bulk load statement

This is due to a missing permission for the MMS Service Application account, not the account running the PowerShell script. In a least-privilege setup, you will find that this account has not been granted the bulkadmin role, which is required to run BULK INSERT statements.

FIX: Add the MMS Service Application account to the bulkadmin fixed server role in SQL Server, either via PowerShell or via SQL Server Management Studio.
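
If you want to script it, something along these lines should work, assuming the SQL Server PowerShell snap-in is available and that DOMAIN\svc_mms (an illustrative name) is your MMS service application account:

## Add the service account to the bulkadmin fixed server role
Add-PSSnapin SqlServerCmdletSnapin100 -ErrorAction SilentlyContinue
Invoke-Sqlcmd -ServerInstance "SQLSERVER01" -Query "EXEC sp_addsrvrolemember N'DOMAIN\svc_mms', N'bulkadmin';"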

ISSUE: Cannot bulk load because the file “C:\99f6833d2bac4c53af26b816afca1d55\ECMGroup.dat” could not be opened

This is due to where the process is running, and affects environments where the SQL server and the SharePoint server are on different machines. The PowerShell command does some work on both the SharePoint server *and* the SQL server, so both need to be able to reach the file.

FIX: Create a shared location for the CAB file that both the SQL server and the SharePoint server can access, and use this in the -Path parameter to the PowerShell command.
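
For example (reusing the $mmsProxy from above, with an illustrative share name), point both the export and the import at the same UNC path:

$sharedPath = "\\SQLSERVER01\MMSExport\mmsdata.cab"
Export-SPMetadataWebServicePartitionData -Identity $termStoreID -ServiceProxy $mmsProxy -Path $sharedPath

## ...and on the target farm:
Import-SPMetadataWebServicePartitionData -Identity $termStoreID -ServiceProxy $mmsProxy -Path $sharedPath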

ISSUE: Access denied

This is due to the account that the MMS service application runs as not having permissions on the shared folder that the CAB file is stored in.

FIX: Ensure that the shared location is set to allow read AND write permissions for the MMS service account.
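
Remember that both the share permissions and the NTFS permissions need to allow this. A quick sketch for the NTFS side (account and path are illustrative):

## Grant the MMS service account modify rights on the shared folder
$acl = Get-Acl "\\SQLSERVER01\MMSExport"
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule("DOMAIN\svc_mms", "Modify", "ContainerInherit,ObjectInherit", "None", "Allow")
$acl.AddAccessRule($rule)
Set-Acl "\\SQLSERVER01\MMSExport" $acl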

Hopefully this post will help other people debug what is happening when this step fails.

International SharePoint Conference – London 2012

Often, developers new to SharePoint can get lost in the new world they encounter – with strange, obscure acronyms and terminology. There is a steep learning curve for developers and team leads starting their first development project in SharePoint.
 
At the International SharePoint Conference in London last week, the developers track did something different. We broke away from the traditional session-based approach to concentrate on creating a single solution on the SharePoint platform. We took input from the IT Pro and Business tracks and explored the challenges of building SharePoint projects from the ground up. Concentrating on helping attendees with the choices that they will have to make when designing and building a solution, the conference aimed to give developers clear guidance on the important processes and decisions that make the difference between success and failure on SharePoint projects. We started with nothing but a list of requirements – to build a Knowledge Base solution – and ended the 3 days by publishing the solution on a live instance of Office 365.
 
Taking us out of the traditional 60-75 minute time slot had its fair share of challenges. Would the attendees want to sit through 3 whole days on the same scenario? What would happen if the servers broke halfway through the conference? How do you get a diverse group of MCAs, MCMs and MVPs to agree on anything? Judging by the reactions, both on Twitter and in person at the conference, the new approach worked very well.
 
The 16 sessions covered topics such as customizing Visual Studio, the building blocks of the solution (web parts, branding, content types etc), packaging and deployment, advanced features (Managed Metadata, Business Connectivity Services, Search, Word Automation) and integration with cloud services (SQL Azure, Azure worker roles and Office 365).
 
The nine speakers (Andrew Connell, Ben Robb, Eric Shupps, Matthew McDermott, Mirjam van Olst, Paul Schaeflein, Todd Carter, Waldek Mastykarz and Wictor Wilén), together with support from the IT Pro track (especially Steve Smith and Spence Harbar), worked hard for nearly 6 months to bring this project to fruition, and announced the publication of the code base onto Codeplex at the conference. You can download the various projects which make up the solution here: http://spkbase.codeplex.com

Best Practice Conference DC

I’m putting the finishing touches to my sessions at this year’s Best Practices Conference in Washington, D.C., and I’m really looking forward to catching up with lots of old friends from across the pond – especially since so many couldn’t make it over to SP Evolution because of the Ash Cloud in April. The line-up this year is absolutely fantastic, with lots of MVPs, MCMs and other experts making the time to put on one of the biggest SharePoint conferences of the year.

[Image: Best Practices Conference 2010]

I’m doing 3 sessions:

  • Building search driven internet applications using FAST and SharePoint WCM
  • Build a SharePoint 2010 publishing website in 60 minutes
  • Extending Web Analytics reports to provide campaign / goal driven websites in SharePoint 2010

3 sessions in 3 days, plus an Ask the Experts session, is going to be hard going, but should be good fun too. I’ll be posting supporting material up during the conference, so stay tuned.

Demo content: importing the complete works of Shakespeare to SharePoint

We often find that we need a fair amount of sample content to really showcase some extended WCM and FAST scenarios, and lorem ipsum and other automated text generation just wasn’t cutting it. As someone with a keen interest in the theatre, where better to turn than Shakespeare’s plays – especially since I found out that someone went to the trouble of marking up all 37 plays in XML.

There were a number of challenges to work through:     

  1. The XML I downloaded had DTD references that I couldn’t be bothered to get working with the System.Xml.XmlDocument object, so I went in and deleted them from all of the files. This made processing easier.
  2. The structure I chose to implement this in was a series of webs beneath a root web, with Root / Plays / [Play name] / [Act name]. Each act web has pages beneath it.
  3. I needed to convert the data in each scene to XHTML to push into the PageContent field.
  4. It takes about an hour on my VM to process all 37 plays – resulting in around 130 webs and nearly 1000 SharePoint publishing pages.

With this in mind I had the following code:

$xmlpath = "C:\SPBPC2010\Assets\shaks200"
$xmlpath | ls -Filter "*.xml" | % { Process-Play -play ([xml] (Get-Content $_.FullName)) -parentweb $playsWeb }

This iterates through the folder that contains the XML files for the plays, and calls “Process-Play”.             
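
The pipeline above (and the functions below) lean on a handful of variables set up earlier in the script – something like this, where all of the values are illustrative rather than the exact ones from my demo:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$playsWeb = Get-SPWeb "http://intranet/plays"  ## parent web that all of the play webs sit under
$basetemplate = "CMSPUBLISHING#0"  ## publishing site template used for the new webs
$basepagelayout = "ArticleLeft.aspx"  ## page layout used for the scene pages
$sceneXsl = New-Object System.Xml.Xsl.XslCompiledTransform
$sceneXsl.Load("C:\SPBPC2010\Assets\scene.xsl")  ## XSLT that turns a SCENE element into XHTML

With that in place, Process-Play looks like this: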

function Process-Play([xml] $play, [Microsoft.SharePoint.SPWeb] $parentWeb)
{
Write-Host "Processing play: " $play.PLAY.TITLE

## 1. Create the web for the PLAY
$name = Convert-ToSafeString $play.PLAY.TITLE
$playWeb = New-SPWeb ("{0}/{1}" -f $parentWeb.Url, $name) -Name $play.PLAY.TITLE -Template $basetemplate

## 2. Process the ACTS
$play.PLAY.ACT | % { Process-Act -act $_ -web $playWeb }

Write-Host "Finished processing play: " $play.PLAY.TITLE
Write-Host "-------------------------------------------------------"
}
 

Here we use a function called “Convert-ToSafeString” to get rid of spaces, quotes and other characters which will cause issues in a URL, and then create a new web for that play. We then walk down through the XML ($play.PLAY.ACT) and call “Process-Act”. This does a very similar thing for each act, and in turn calls “Process-Scene”:

$act.SCENE | % { Process-Scene -scene $_ -web $actWeb }
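
Process-Act follows the same pattern as Process-Play; a sketch of it (treat this as illustrative, not the exact session code) might look like this:

function Process-Act([System.Xml.XmlElement] $act, [Microsoft.SharePoint.SPWeb] $web)
{
Write-Host "Processing act: " $act.TITLE

## Create a child web for the ACT, mirroring Process-Play
$name = Convert-ToSafeString $act.TITLE
$actWeb = New-SPWeb ("{0}/{1}" -f $web.Url, $name) -Name $act.TITLE -Template $basetemplate

## Each SCENE becomes a publishing page beneath the act web
$act.SCENE | % { Process-Scene -scene $_ -web $actWeb }
}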

PowerShell is great for this kind of iterative processing, and is a real time saver when it comes to writing quick scripts for tasks like this.

Process-Scene is where we start actually putting in some content. For each scene, we want to create a Publishing Page and then add content to the Page Content field: 

function Process-Scene([System.Xml.XmlElement] $scene, [Microsoft.SharePoint.SPWeb] $web)
{
Write-Host "Processing the scene: " $scene.TITLE

## 1. Create the page
$pubweb = Get-SPPublishingWeb -web $web
$pagelayout = Get-SPPublishingPageLayout -web $web -name $basepagelayout
$pagename = Convert-ToSafeString ($scene.TITLE.Substring(0, $scene.TITLE.IndexOf(".")))
$pagecollection = [Microsoft.SharePoint.Publishing.PublishingPageCollection] $pubweb.GetPublishingPages()
$page = New-SPPublishingPage -PageCollection $pagecollection -name $pagename -pagelayout $pagelayout -title $scene.TITLE

## 2. Field: PageContent
$pagecontent = Transform-Xml -xsl $sceneXsl -xml $scene
$page.ListItem.Set_Item("PublishingPageContent", $pagecontent)

## 3. Update and publish
$page.Update()
$page.ListItem.Update()
$page.CheckIn($true)
$page.ListItem.File.Publish($true)
}
Here you can see we are creating a new publishing page [I have a utility function for this – it isn’t out of the box], and calling another utility function “Transform-Xml” to set the value of the PublishingPageContent field to the transformed content.
 
These utility functions are here: 
## ==============================================================================
## UTILITY FUNCTIONS
## ==============================================================================
 
function Get-SPPublishingWeb( [Microsoft.SharePoint.SPWeb] $web )
{
<#
       .SYNOPSIS
              Gets a SharePoint publishing web
       .EXAMPLE
              Get-SPPublishingWeb -web (Get-SPWeb http://intranet)
#>
       return [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($web)
}
 
## ==============================================================================
 
function Get-SPPublishingPageLayout(
       [Microsoft.SharePoint.SPWeb] $web,
       [string] $name)
{
<#
       .SYNOPSIS
              Gets a publishing page layout by name
       .EXAMPLE
              Get-SPPublishingPageLayout -web $web -name "ArticleLeft.aspx"
#>
       $pubWeb = Get-SPPublishingWeb -web $web
       return $pubWeb.GetAvailablePageLayouts() | ? { $_.Name -eq $name }
}
 
## ==============================================================================
 
function New-SPPublishingPage(
       [Microsoft.SharePoint.Publishing.PublishingPageCollection] $PageCollection,
       [string] $Name,
       [Microsoft.SharePoint.Publishing.PageLayout] $PageLayout,
       [string] $Title)
{
<#
       .SYNOPSIS
              Creates a publishing page in the PublishingPageCollection supplied
       .EXAMPLE
              New-SPPublishingPage -PageCollection $pages -Name "scene-1" -PageLayout $layout -Title "Scene 1"
#>
       Write-Host "Creating page: $Name"
       $newPage = $PageCollection.Add(($Name + ".aspx"), $PageLayout)
       $newPage.Title = $Title
       $newPage.Update()
       $newPage.ListItem.Update()

       return [Microsoft.SharePoint.Publishing.PublishingPage] $newPage
}
 
## ==============================================================================
 
function Convert-ToSafeString([string] $s)
{
       return $s.Replace(" ", "-").Replace("'", "").Replace(",", "")
}
 
## ==============================================================================
 
function Transform-Xml([System.Xml.Xsl.XslCompiledTransform] $xsl, $xml)
{
       ## Run the compiled XSLT over the XML node and return the output as a string
       $sw = New-Object System.IO.StringWriter
       $xsl.Transform($xml, $null, $sw) | Out-Null
       Write-Output $sw.ToString()
}

Manipulating the Term Store through PowerShell

I really like PowerShell: for its repeatability, for the fact that once I’ve worked out how to do something I can save it in my stash of modules for later, and for the way I can spin up lots of different scenarios very quickly.

For my session on “Build a website in 60 minutes” at SP Evo, I spent a fair amount of time pulling together scripts for manipulating term store data quickly. I could, I guess, have just used the out of the box “import CSV file” option from the UI. But that won’t cut it when moving between multiple environments in the real world. I need to be able to do it from my build scripts, and I couldn’t find any way via the API to import from a CSV – it seems that code is locked into the web UI, not baked into the core APIs.

There are other posts out there on how to set up the term store in the first place, both manually and via PowerShell, so I’ll not repeat them here. But assuming you have a term store available, here is some very straightforward PowerShell to start populating it:

Assuming you have an XML file in this format:

<?xml version="1.0"?>
<termstore name="Managed Metadata Service Application Proxy">  
  <group name="SPEvo">
    <termset name="Services">
      <term name="Customer Engagement">
        <term name="Analytics">
        </term>
        <term name="Search Engine Optimisation">
        </term>
        <term name="Engagement Strategies">
        </term>
      </term>
      <term name="Technology">
        <term name="Technical Build and Delivery">
        </term>
        <term name="Technical Consultancy">
        </term>
        <term name="Technical Design">
        </term>
      </term>
      <term name="User Experience">
        <term name="Creative Design">
        </term>
        <term name="Information Architecture">
        </term>
      </term>
    </termset>
  </group>
</termstore>

You should be able to manipulate that like this:

$TermStoreData = [xml] (Get-Content $xmlpath)
$site = Get-SPSite $url
$session = New-Object Microsoft.SharePoint.Taxonomy.TaxonomySession($site)
$termstore = $session.TermStores[$TermStoreData.termstore.name]
$TermStoreData.termstore.group |
ForEach-Object {
  ## create the group
  if ($termstore.Groups[$_.name] -eq $null)
  {
    $group = $termstore.CreateGroup($_.name)
    Write-Host -ForegroundColor Cyan "Added group $($_.name)"
    $_.termset |
    ForEach-Object {
      ## create the termset
      $termset = $group.CreateTermSet($_.name)
      Write-Host -ForegroundColor Cyan "Added termset $($_.name)"
      SetTermsRecursive -termsetitem $termset -parentnode $_
    }
  }
}
$termstore.CommitAll()

Some of the “power” of “PowerShell” is in its ability to load up standard .NET objects; in this case there are no out of the box cmdlets to add term store data, so we have to make use of New-Object to load up the TaxonomySession object. From that we are into native SharePoint code – walking down the XML and creating Term Store Groups, Term Sets and then ultimately Terms. The terms in this example are created by a recursive function:

function SetTermsRecursive([Microsoft.SharePoint.Taxonomy.TermSetItem] $termsetitem, $parentnode)
{
  $parentnode.term |
  ForEach-Object {
    ## create the term, then recurse into its children
    $newterm = $termsetitem.CreateTerm($_.name, 1033)
    Write-Host -ForegroundColor Cyan "Added term $($_.name)"
    SetTermsRecursive -termsetitem $newterm -parentnode $_
  }
}

Pretty neat? Some core PowerShell abilities are showcased here:

  • Walking through XML DOM structures via the “dot” shorthand notation. For example, $TermStoreData.termstore.group is equivalent to $TermStoreData.SelectNodes("/termstore/group") – the dot notation means less typing and more readable code.
  • Pipelines – for example, the ForEach-object and pipeline associated $_ syntax.

These can be difficult to read in examples, but I strongly advise anyone starting out in PowerShell to get to grips with both, particularly the piping model.
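
As a quick illustration against the sample XML above, these two lines print the same top-level term names:

## Dot notation...
$TermStoreData.termstore.group.termset.term | ForEach-Object { Write-Host $_.name }
## ...is equivalent to the more verbose XPath form:
$TermStoreData.SelectNodes("/termstore/group/termset/term") | % { Write-Host $_.name }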

Anyway, the end result is that in a few lines of fairly simple PowerShell you can take an XML file and build up a term store with data as part of your deployment model. I’ll post later about why that is important if you ever want to use those term sets in, say, a site column declaration…

Creating SharePoint Page Layouts as features in 2010

You have hacked together a proof of concept in SharePoint Designer, and now you need to start working for real with SharePoint features. In 2007, you had little choice other than learning the syntax and working out the values for the associated content type by hand.

What we are looking for is something like: 

<File Path="MyPageLayout.aspx" Url="MyPageLayout.aspx" Type="GhostableInLibrary">
    <Property Name="Title" Value="My Page Layout"/>
    <Property Name="ContentType" Value="$Resources:cmscore,contenttype_pagelayout_name"/>
    <Property Name="PublishingAssociatedContentType" Value=";#MyContentType;#[a really long id];#"/>
</File>

Ideally, we’d like some nice powerful mechanisms from within the Visual Studio IDE for generating this stuff. I’m working on that as part of a submission to the CKS:DEV team, but in the meantime we do have another solution. In 2010, you now have PowerShell. And look what it can do for you…

First, we need to be able to get the page layout in question from a supplied SPWeb and name.

function Get-SPPublishingPageLayout(
    [Microsoft.SharePoint.SPWeb] $web,
    [string] $name)
{
    $pubWeb = Get-SPPublishingWeb -web $web
    return $pubWeb.GetAvailablePageLayouts() | ? { $_.Name -eq $name }
}

Next, we need to format all those bits into the right structure:

function Get-XmlForPageLayout([string] $name)
{
    $layout = Get-SPPublishingPageLayout -web (Get-SPWeb $siteurl) -name $name

    ## Single-quoted here-string so that $Resources is emitted literally;
    ## -f then fills in the {n} placeholders
    return (@'
<File Path="{0}" Url="{0}" Type="GhostableInLibrary">
    <Property Name="Title" Value="{1}"/>
    <Property Name="ContentType" Value="$Resources:cmscore,contenttype_pagelayout_name"/>
    <Property Name="PublishingAssociatedContentType" Value=";#{2};#{3};#"/>
</File>
'@ -f "PublishingModule\$name", $layout.Title, $layout.AssociatedContentType.Name, $layout.AssociatedContentType.Id)
}

This now makes it really easy to get the format you need for your elements.xml file. All you need to do now is call this function:

$siteurl = "http://www.fabrikam.com"
Get-XmlForPageLayout -name "homepage.aspx"

Now all you need to do is cut and paste that output from your PowerShell console into your elements.xml in Visual Studio.

SPEvo slides

SharePoint Evolution was a blast, and even though a lot of speakers didn’t make it over, everyone pulled together. All credit to the Combined Knowledge team, especially Steve Smith and Zoe Watson for not breaking under the pressure of #ashtag.

Like many others (especially Spence Harbar and Eric Shupps), I did a number of extra sessions, and I’ve uploaded three of the decks to Slideshare.

  • Building high scale, highly available websites in SharePoint 2010
  • Introduction to the Client OM in SharePoint 2010
  • Build a SharePoint website in 60 minutes

Customising the SharePoint management UI – just say no!

I was recently asked at an event how to make changes to some of the core management UI features of SharePoint 2010. The question was specifically around how to make a change to the Silverlight control you get when you click on “Create” in the “View all site content” page.

What you get when you click on that link now that v4’s “Fluent UI” is in full swing is a nifty Silverlight control which looks very similar to the Backstage found in the new Office clients.

[Image: the Add Gallery in SharePoint 2010]

[Image: the Backstage view in Word]

My response to clients who ask for the management UI in SharePoint to be customised [beyond what you get from master pages and themes] is to ask them: “What is the business value that doing so will add?”. Without clear business drivers to making these changes, they quickly become a drain on the project, and over time will add considerable cost to the ongoing support of the solution.

Spence Harbar had a great post on this back in the 2007 days, and much of what he said then still holds. You get higher TCO by changing core features like this – on-going maintenance, support, training and so on will all be higher, and there is a higher chance that you will be locked into your delivery partner for a longer time frame. Also, chances are that you will be hit by additional testing and development as service packs and other hotfixes get released.

So I’d push back when you see this kind of requirement come in. If the client understands the cost, the additional risk, and the downsides of customising the UI, then fine, but don’t walk blind into that kind of scenario.

Semantic mark-up for menus

In SharePoint 2007, if you were building a tightly designed website, chances are that you were faced with a problem when using the out of the box (or ASP.NET) menu controls. Before ScottGu’s team were converted to believing that standards compliance was a good thing, the ASP.NET 2.0 controls came up woefully short when it came to clean mark-up, relying on nested tables to get the layout right.

SharePoint 2007 continued this theme, wrapping its Microsoft.SharePoint.WebControls.AspMenu control in more tables and, worse, sealing it so that you couldn’t inherit from the control to solve the rendering issue. Thankfully, there was a workaround using control adapters to take over the Render() method.

But it was still a pain. The configuration files for your control adapter have to live on the file system, in the App_Browsers directory, which means that a WSP cannot deploy them to your farm. They sometimes introduce performance penalties on high scale sites. And in the case of menu controls, there was business logic wrapped into the original Render() method which was hard to reproduce, since it called internal methods.

How has the story improved in 2010? ASP.NET 4.0 does solve that problem, but SharePoint 2010 will not be shipping with support for ASP.NET 4.0 – that will have to wait for a service pack. But all is not lost: the SharePoint product team has made a significant investment in standards compliance and providing better quality mark-up. An example is the new UseSimpleRendering property on the AspMenu control. This Boolean flag is all you need to set if you want your navigation list to be rendered using <li> tags.

So let’s check what output we get with that property set. With UseSimpleRendering="false", the output is the traditional mark-up that 2007 produced, with nested tables and inline styles, both frowned on by the web designer community.

<table id="zz1_TopNavigationMenuV4" class="menu zz1_TopNavigationMenuV4_2"
cellpadding="0" cellspacing="0" border="0">
  <tr>
    <td onmouseover="Menu_HoverStatic(this)" onmouseout="Menu_Unhover(this)"
    onkeyup="Menu_Key(this)" id="zz1_TopNavigationMenuV4n0">
      <table cellpadding="0" cellspacing="0" border="0" width="100%">
        <tr>
          <td style="white-space:nowrap;">
            <a class="zz1_TopNavigationMenuV4_1" href="/" accesskey="1">Home</a>
          </td>
        </tr>
      </table>
    </td>
    <td style="width:3px;"></td>
    <td onmouseover="Menu_HoverStatic(this)" onmouseout="Menu_Unhover(this)"
    onkeyup="Menu_Key(this)" id="zz1_TopNavigationMenuV4n1">
      <table cellpadding="0" cellspacing="0" border="0" width="100%">
        <tr>
          <td style="white-space:nowrap;">
            <a class="zz1_TopNavigationMenuV4_1" href="/about-us/Pages/default.aspx">About us</a>
          </td>
        </tr>
      </table>
    </td>
    <td style="width:3px;"></td>
  </tr>
</table>

With UseSimpleRendering="true", the output is now much cleaner:

<div id="zz15_TopNavigationMenuV4" class="menu">
  <div class="menu horizontal menu-horizontal">
    <ul class="root static">
      <li class="static">
        <a class="static menu-item" href="/" accesskey="1">
          <span class="additional-background">
            <span class="menu-item-text">Home</span>
          </span>
        </a>
      </li>
      <li class="static selected">
        <a class="static selected menu-item" href="/about-us/Pages/default.aspx">
          <span class="additional-background">
            <span class="menu-item-text">About us</span>
            <span class="ms-hidden">Currently selected</span>
          </span>
        </a>
      </li>
    </ul>
  </div>
</div>

Note the lack of tables, the absence of inline styles, and the fact that accessibility has clearly been taken care of, with specific textual cues indicating that a particular item is selected.

Rock on!

Slides from SUGUK session on BI

I presented at the London SharePoint User Group (SUGUK-London) last Thursday, on what is coming in the 2010 wave of products around BI. There was a great turnout and a lot of new (and familiar) faces in the audience.

Obviously SharePint afterwards was also fun :)

Some people have asked for a copy of my slides: SUGUK Business Intelligence slide deck
