Monthly Archives: May 2008

SharePoint Migration Tip: Use “untagged data” Views For Incremental Migration

In one of my very first blog posts, I described the overall process we followed to migrate a customer from SPS 2003 to MOSS.  A reader left a comment asking for more detail, and here it is.

For that migration project, we had to find a good way to move a lot of SPS 2003 documents over to MOSS.  The initial load was easy enough: create a new target document library in MOSS and use Windows Explorer to move the documents.

This is the new document library:

[Screenshot: the new, empty document library]

Open two Windows Explorer windows.  Point the first at SPS 2003 and the second at the new document library in MOSS.  The following screen shot shows this.  Note that the top window is actually pointing at my c:\temp folder, but you can imagine it pointing to an SPS 2003 document library:

[Screenshot: two Windows Explorer windows, source above and target below]

After that drag and drop operation, my target looks like this:

[Screenshot: the target library after the drag-and-drop operation]

Now it’s time to deal with the metadata.  Assume we have just one column of metadata for these documents, named "location."  We can see from the "all documents" view above that the location is blank.  It’s easy enough to use a datasheet view to enter the location, or even to go into each document’s properties one by one and add a location.  Let’s assume that there is no practical way to assign the location column a value automatically and that end users must do this by hand.  Furthermore, let’s assume there are hundreds of documents (maybe thousands) and that it will take many days to update the metadata.  As we all know, no one is going to sit down and work for four or five days straight updating metadata for documents.  Instead, they will break that out over a period of weeks or possibly longer.  To facilitate this process, we can create an "untagged data" view as shown:

[Screenshot: defining the "untagged data" view]

Now, when someone sits down to spend their allocated daily hour or two to tag migrated documents, they can use the "untagged documents" view to focus their effort:

[Screenshot: the "untagged documents" view in use]

As users tag documents, they drop off this list.
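If you’re provisioning this view as part of the migration work rather than clicking through the browser, the same thing can be done with the object model.  This is a rough sketch only; the site URL and library name are placeholders, while "Location" is the metadata column from the example:

```csharp
using System.Collections.Specialized;
using Microsoft.SharePoint;

// Create an "Untagged Documents" view showing only items whose
// Location column is still empty.
using (SPSite site = new SPSite("http://moss/sites/migration")) // placeholder URL
using (SPWeb web = site.OpenWeb())
{
    SPList library = web.Lists["Migrated Documents"]; // placeholder library name

    StringCollection viewFields = new StringCollection();
    viewFields.Add("LinkFilename");
    viewFields.Add("Location");

    // CAML filter: only items where Location has no value.
    string query = "<Where><IsNull><FieldRef Name='Location'/></IsNull></Where>";

    library.Views.Add(
        "Untagged Documents",
        viewFields,
        query,
        100,     // row limit per page
        true,    // paged
        false);  // don't make it the default view
    library.Update();
}
```

Because the view is just a CAML filter, items drop out of it automatically as soon as someone supplies a location value.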

This notion of an untagged data view can also help with a class of data validation problem people inquire about on the forums.  Out of the box, there’s no way to prevent a user from uploading a document to MOSS and then never entering metadata.  We can mark a particular site column as mandatory, so the user won’t be allowed to push the save button without supplying a value.  However, if the user uploads the document and then closes the browser (or uses Windows Explorer to upload it), we can’t force the user to enter metadata (again, out of the box).

This approach can help with that situation.  We can use a "poorly tagged data" view to easily identify these documents and correct them.  Couple this with a KPI and you have good visibility into the data, with drill-down, to manage these exceptional circumstances.


Subscribe to my blog.


SharePoint Wildcard Search: “Pro” Is Not a Stem of “Programming”

On the MSDN search forum, people often ask a question like this:

"I have a document named ‘Programming Guide’ but when I search for ‘Pro’ search does not find it."

It may not feel like it, but that amounts to a wildcard search.  The MOSS/WSS user interface does not support wildcard search out of the box.

If you dig into the search web parts, you’ll find a checkbox, "Enable search term stemming".  Stemming is a human-language concept; it’s not a computer-language substring() type function.

These are some stems:

  • "fish" is a stem to "fishing"
  • "major" is a stem to "majoring"

These are not stems:

  • "maj" is not a stem to "major"
  • "pro" is not a stem to "programmer"

The WSS/MOSS search engine does support wildcard search through the API.  Here is one blog article that describes how to do that: http://www.dotnetmafia.com/blogs/dotnettipoftheday/archive/2008/03/06/how-to-use-the-moss-enterprise-search-fulltextsqlquery-class.aspx
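For reference, here is a minimal sketch of what that API approach looks like.  The site URL is a placeholder; the key point is that the full-text SQL syntax's CONTAINS predicate accepts a trailing wildcard, so "pro*" matches "programming":

```csharp
using Microsoft.Office.Server.Search.Query;
using Microsoft.SharePoint;

using (SPSite site = new SPSite("http://moss")) // placeholder URL
{
    FullTextSqlQuery query = new FullTextSqlQuery(site);
    // Trailing wildcard: matches "programming", "programmer", "project", etc.
    query.QueryText =
        "SELECT Title, Path FROM SCOPE() WHERE CONTAINS('\"pro*\"')";
    query.ResultTypes = ResultType.RelevantResults;

    ResultTableCollection results = query.Execute();
    ResultTable relevant = results[ResultType.RelevantResults];
    // relevant implements IDataReader; iterate it to read Title and Path.
}
```

Note that this only works through code (or a custom search web part that wraps it); the out-of-the-box search box still won’t honor the asterisk.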

A third-party product, Ontolica, provides wildcard search.  I have not used that product.


Logging Workflow Activity in SharePoint Designer

Last week, I was working out how to loop and implement a state machine using SharePoint Designer and mentioned, as an aside, that I would probably write a blog post about better workflow logging.

Well, Sanjeev Rajput beat me to it.  Have a look.

Saving log data into a custom list seems superior to using the regular workflow history:

  • It’s just a custom list, so you can export it to Excel very easily.
  • You can create views, dynamically filter the data, etc.
  • It’s not subject to the auto-purge you get with regular workflow history.

There are some risks / downsides:

  • Many running workflows with a lot of logging could cause too much data to be written to the list.
  • Maybe you *do* want automatic purging.  You don’t get that feature with this approach (without coding).
  • Security is tricky.  In order to write to the list, the user must have permission to do so.  That means that it’s probably not suitable for any kind of "official" audit since the user could discover the list and edit it.  This could be overcome with some custom programming.
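On that last point, the "custom programming" fix could be as simple as writing the log entries from code running with elevated privileges, so ordinary users never need write access to the list.  A rough sketch, where the list name and extra columns are assumptions for illustration:

```csharp
using Microsoft.SharePoint;

// Write a log entry to a custom "Workflow Log" list. Running this inside
// SPSecurity.RunWithElevatedPrivileges means the list itself can be locked
// down so end users can't edit (or even see) it.
SPSecurity.RunWithElevatedPrivileges(delegate()
{
    using (SPSite site = new SPSite("http://moss/sites/workflow")) // placeholder
    using (SPWeb web = site.OpenWeb())
    {
        SPList log = web.Lists["Workflow Log"];      // assumed custom list
        SPListItem entry = log.Items.Add();
        entry["Title"] = "Approval step completed";
        entry["WorkflowName"] = "Document Approval"; // assumed column
        entry.Update();
    }
});
```

Since the log is a plain custom list, all the benefits above (views, filters, Excel export) still apply.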


The Trouble With Tribbles … err .. KPIs

This past week I finished off a proof of concept project for a client in Manhattan.  While implementing the solution, I ran into another shortcoming of MOSS KPIs (see here for a previous KPI issue and my workaround).

Background: We used a SharePoint Designer workflow to model a fairly complex, multi-month business process.  As it chugged along, it would update some state information in a list.  KPIs use this data to do their mojo.

We decided to create a new site each time one of these business processes kicks off.  Aside from the workflow itself, these sites host several document libraries, use audience targeting and so forth.  Just a bunch of stuff to help with collaboration among the internal employees, traveling employees and the client’s participating business partners.

We also wanted to show some KPIs monitoring the overall health of that specific business process, driven by the workflow state data.

Finally, we used KPI list items that do a count against a view on a list in the site (as opposed to pulling from another data source, like Excel or SQL).

The Problem: As you can imagine, assuming we were to carry the basic idea forward into production, we would want a site template: provision a new site based off a "business process" template.

The problem is that you can’t get a functioning KPI that way.  When I create a new site based on a template with a KPI list and KPI web part, the new site’s KPI data are broken: the new site’s KPI list points at whatever source you defined when you first saved the site as a template.

By way of example:

  • Create a new site and build it to perfection.  This site includes the KPI data.
  • Save that as a template.
  • Create a new site and base it off the template.
  • This new site’s KPI list items’ sources point to the site template, not the current site.

The instantiation process does not correct the URL.

I tried to solve this by specifying a relative URL when defining the KPI list item.  However, I couldn’t get any variation of that to work.

I always want to pair these "problem" blog posts with some kind of solution, but in this case I don’t have a good one.  The best I can figure is that you need to go into the newly provisioned site and fix everything manually.  The UI makes this even harder because changing the URL of the source list causes a refresh, so you really have to redefine the whole KPI from scratch.
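One possible (untested) way to automate the manual fix would be a feature receiver or provisioning step that rewrites the stale URLs after the new site is created.  Everything in this sketch is an assumption: the site URLs, the list name, and especially the "DataSource" field name, which you would need to confirm by inspecting the Fields collection of a real KPI list item:

```csharp
using Microsoft.SharePoint;

// Hypothetical post-provisioning fix-up: repoint each KPI indicator's
// source URL from the template site to the newly created site.
using (SPSite site = new SPSite("http://moss/sites/newprocess")) // placeholder
using (SPWeb web = site.OpenWeb())
{
    SPList kpiList = web.Lists["KPI List"]; // placeholder list name
    foreach (SPListItem indicator in kpiList.Items)
    {
        // "DataSource" is a guess at the internal field holding the
        // source list URL -- verify against indicator.Fields first.
        string source = indicator["DataSource"] as string;
        if (source != null && source.Contains("/sites/templatesite/"))
        {
            indicator["DataSource"] =
                source.Replace("/sites/templatesite/", "/sites/newprocess/");
            indicator.Update();
        }
    }
}
```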

If anyone knows a better way to handle this, please post a comment.


MOSS Small Farm Installation and Configuration War Story

This week, I’ve struggled a bit with my team to get MOSS installed in a simple two-server farm.  Having gone through it, I have a greater appreciation for the kinds of problems people report on the MSDN forums and elsewhere.

The final farm configuration:

  • SQL/Index/Intranet WFE inside the firewall.
  • WFE in the DMZ.
  • Some kind of firewall between the DMZ and the internal server.

Before we started the project, we let the client know which ports needed to be open.  During the give and take over that, we never explicitly said two important things:

  1. SSL means you need a certificate.
  2. The DMZ server must be part of a domain. 

Day one, we showed up to install MOSS and learned that the domain accounts for the database and MOSS hadn’t been created.  To move things along, we went ahead and installed everything with a local account on the intranet server.

At this point, we discovered the confusion over the SSL certificate and, sadly, decided to have our infrastructure guy come back later that week to continue installing the DMZ server.  In the meantime, we solution architects moved ahead with the business stuff.

A weekend goes by and the client obtains the certificate.

Our infrastructure guy shows up and discovers that the DMZ server is not joined to any domain (either a perimeter domain with limited trust or the intranet domain).  We wasted nearly half a day on that.  If we hadn’t let the missing SSL certificate bog us down, we would have discovered this earlier.  Oh well….

Another day passes and the various security committees, interested parties and (not so) innocent bystanders all agree that it’s OK to join the DMZ server with the intranet domain (this is a POC, after all, not a production solution).

Infrastructure guy comes in to wrap things up.  This time we successfully pass through the modern-day gauntlet affectionately known as the "SharePoint Configuration Wizard."  We have a peek in central administration and … yee haw! … the DMZ server is listed in the farm.  We look a little closer and realize we broke open the champagne a mite bit early.  The Windows SharePoint Services Web Application service is stuck in a "Starting" status.

Long story short, it turns out we had forgotten to change the identity of the service account, via central administration, from the original local account to the new domain account.  We did that, re-ran the configuration wizard, and voila!  We were in business.


Mea Culpa — SharePoint Designer *CAN* Create State Machine Workflows

I’ve recently learned that it’s possible, and even fairly easy, to create a state machine workflow using SharePoint Designer.  Necessity is the mother of invention and all that good stuff, and I had a need this week that went looking for an invention.  Coincidentally, I came across this MSDN forum post as well.  My personal experience this week and that "independent confirmation" lend strength to my conviction.  I plan to write about this at greater length with a full-blown example, but here’s the gist of it:

  • The approach leverages the fact that a workflow can change a list item, thereby triggering a new workflow.  I’ve normally considered this to be a nuisance and even blogged about using semaphores to handle it.
  • SharePoint allows multiple independent workflows to be active against a specific list item.

To configure it:

  • Design your state machine (i.e., the states and how states transition from one to the next).
  • Implement each state as a separate workflow.
  • Configure each of these state workflows to execute in response to any change in the list item.

Each state workflow follows this rough pattern:

  • Upon initialization, determine whether it should really run by inspecting state information in the "current item".  Abort if not.
  • Do the work.
  • Update the "current item" with new state information.  This triggers an update to the current item and fires off all the state workflows.

Aside from the obvious benefit that one can create a declarative state machine workflow, all that state information is terrific for building KPIs and interesting views.

It does have a fairly substantial drawback: standard workflow history tracking is even more useless than normal 🙂  That’s easily remedied, however.  Store all of your audit-type information in a custom list.  That’s probably a good idea even for a vanilla sequential workflow, but that’s for another blog post 🙂

I call this a "mea culpa" because I have, unfortunately, said more than once on forums and elsewhere that one must use visual studio to create a state machine workflow.  That simply isn’t true.


Learning the Hard Way — DMZ WFE Must be in a Domain

Although it’s not literally true, as a practical matter an internet-facing web front end in a DMZ must be in a domain (i.e., not some standalone server in its own little workgroup).  It doesn’t need to be in the same domain as the internal WFE(s) and other servers (and probably shouldn’t), but it needs to be in a domain.

My colleagues and I spent an inordinate amount of time on a proposal that included SharePoint prerequisites, among them a comprehensive list of firewall configurations that would enable the DMZ server to join the farm.  Sadly, we failed to add a sentence somewhere that said, in effect, "the whole bloody point of this configuration is to allow your DMZ WFE server, in a domain, to join the internal farm."

A perfect storm of events, where we basically looked left when we might have looked right, conspired to hide this problem from us until fairly late in the process, thus preventing me from invoking my "tell bad news early" rule.

Sigh.


If You Haven’t Tried Twitter …

Twitter is a very odd duck.  I’ve been using Twitter for a little over a month and in some indefinable way, it’s almost as important to me as email.  I find myself vaguely unsettled if I wait too long before looking over what others are twittering about.  I get annoyed at Twitter’s occasional performance problems because it means I’m missing out.  I get a little puff of excitement when I see a new Woot announcement.

It’s a real community builder in a way that really complements blogs and forums and even personal face to face meetings.

In the last month, I’ve followed one person’s attempts at shaking a cold while trying to manage a Seder.

I’ve learned personal details about many folks I mainly "know" through blogs — where they live, the kinds of projects they work on, that they have work/family issues to manage just like me.

One person’s mother passed away … a sad event for sure.  But sharing that fact changes and enhances the character of the whole experience.

That’s just the personal stuff.

There’s more to it than that.  It’s also another medium for sharing ideas or, more often I think, seeking help.  Throw a question up on Twitter and you’re never left hanging; responses typically arrive within minutes.

If you haven’t tried it, you should really give it a go. 

Look me up at http://www.twitter.com/pagalvin
