Category Archives: SharePoint Solutions Design

Capturing “mailto:” Metrics

I’m on a project where we need to collect metrics around a function named "Share a Story."  The idea is very simple — if you’re looking at an interesting article on the intranet and want to share it with someone, you click a link labeled "Share this story" and email it to your buddy.

We played around with a custom form for this purpose, but in the end, common sense won the day and we just used the familiar <a href="mailto:…"> technique.  (It’s a surprisingly robust little bit of HTML; as a bonus, that link brings me back to my old UNIX man page days; those were the days!)

This technique provides a great interface for end users since they get to use their familiar MS Outlook client (or whatever email client they have installed).

It makes things harder on us poor developer types, since the client *also* wants to run a report in the future that shows how often users share stories and even which stories are shared most often.

We whiteboarded a few potential solutions.  My favorite is to carbon copy (CC) an email-enabled SharePoint list.  That way, the end user still gets the Outlook client while we capture the event, because we’ll get a copy of the email ourselves.  There are some obvious drawbacks.  The main problem is that the user could simply blank out or otherwise mangle the CC address.  And we need to manage that event library of emails; we have a scheduled job on the whiteboard responsible for that cleanup.
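A quick sketch of the idea in plain Python, just to show how the "Share this story" link would be assembled.  The list address, field names and URLs here are all made up for illustration; in practice the CC target would be the email address of the email-enabled SharePoint list:

```python
from urllib.parse import quote

def share_story_link(title, url, capture_address):
    """Build a 'Share this story' mailto: href that pre-fills the
    subject and body and CCs the capture address, so a copy of the
    email lands in the metrics list."""
    subject = quote("Shared story: " + title)
    body = quote("I thought you might like this:\n" + url)
    return "mailto:?cc={0}&subject={1}&body={2}".format(
        capture_address, subject, body)

# Hypothetical story and capture list address.
href = share_story_link("Q3 Results",
                        "http://intranet/news/q3.aspx",
                        "sharestories@example.com")
print('<a href="{0}">Share this story</a>'.format(href))
```

The user still gets the familiar Outlook compose window; the only difference is the pre-filled CC field, which is exactly the part a user can blank out, hence the drawback noted above.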

If you have some clever approach to solving this problem, please do tell.

</end>

Subscribe to my blog.

Follow me on Twitter at http://www.twitter.com/pagalvin

Defining “Great” SharePoint Requirements

As requested and promised, I’ve uploaded my presentation on how to obtain "great" requirements from end users for SharePoint projects and implementations.  It’s here: http://cid-1cc1edb3daa9b8aa.skydrive.live.com/self.aspx/SharePoint/Paul%20Galvin%20Great%20Requirements.zip

I presented this at the SharePoint Best Practices conference in Feb 2009 (www.sharepointbestpractices.com).  If you attended the conference, you’ll also get this on the conference DVD.

The presentation includes a lot of notes with most slides.  It’s not just bullet points.

(See here for my other presentation on a governance case study: http://paulgalvin.spaces.live.com/blog/cns!1CC1EDB3DAA9B8AA!3099.entry)

</end>


Self-Service Site Creation Isn’t Exactly About Creating Sites

Like many SharePoint consultant types, I’ve been exposed to a lot of SharePoint functionality.  Sometimes I dive pretty deep.  Other times I just notice it as I’m flying by to another set of menu options.  One of those is "self-service site creation."  I hadn’t had a need for it until this week.

This week, I need to solve a business problem which I think is going to become more common as companies loosen up and embrace more direct end user control over SharePoint.  In this case, I’ve designed a site template to support a specific end user community.  Folks in this community should be able to create their own sites at will using this template whenever the urge strikes them.

I recalled seeing "self-service site creation" before, and I’d always tucked it away in the back of my head, thinking it was SharePoint lingo meaning, obviously enough, something like "turn me on if you want end users to be able to create sites when they want to."

So, I turn it on, try it out and for me, it’s not creating sites.  It’s creating site collections.   Pretty big difference.  That’s not what I want, not at all.

It is possible to let end users create new subsites via a custom permission level.  This is exactly where I would have gone in the first place, except that the "self-service site creation" label deceived me.  Via Twitter, I learned that it’s deceived others as well 🙂

I’m still working out how to provide a slightly more streamlined process while staying purely out of the box, but there’s a definite path to follow.  Just don’t get distracted by that label.

</end>


Spinning Up Temporary Virtual WFE’s for Fun and Profit

I was one of 20 or 30 (or maybe 100?) panelists last night at the New York SharePoint Users Group meeting.  Instead of the usual presentation format, this was all about Q&A between the audience and the panel members.  Early on, Michael Lotter introduced me to a new idea, and I wanted to share it.

An audience member described how his company had paid a consultant to write an application for them.  The consultant wrote it as a console application using the SharePoint object model, which meant the program had to run on a server in the farm.  As a result, anyone who wanted to use the app had to log onto the server, do the work and log off.  At first, this wasn’t a problem, but soon, more and more (non-technical) users needed to use the utility.  His question was (paraphrasing):

"What are my options?  I don’t want to keep letting users log directly onto the server, but they need that functionality."

Michael Lotter suggested that he configure a new virtual machine, join it to the farm as a WFE and let users run the application from there. 

This is a pretty stunning idea for me.  Generalizing this solution brings to mind the notion of essentially temporary, almost disposable WFE’s.  I think it’s a pretty neat concept.  This temporary WFE can run a console application that uses the SharePoint object model.  You could also use it to run stsadm commands.  It doesn’t have to be part of regular load balancing.  If it goes down or gets wrecked, you can just spin up a new one.  I repeat myself, but I just have to say that I think it’s a really neat idea.

</end>


Large-scale MOSS Document Management Projects: 50k Per Day, 10 Million Total

This past week, someone asked a question about creating a SharePoint environment that would handle a pretty high volume of new documents (10,000 +/- in this case).  I don’t know much about this, but thanks to this white paper, I feel much better informed.

For me, this white paper is pretty much just a bookmark at the moment, but I did start reading through it and thought I’d highlight my main take-away.  SharePoint can be scaled to handle, at a minimum, this load:

  • 50k new documents per day.
  • 10 million documents total.

I write the 50k/10MM figures because they’re easy to remember.  As long as you know they are minimums, you won’t get into trouble.  The maximums are at least 10 percent higher than that, and with extreme tuning, possibly a lot higher.
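As a quick sanity check on how those two figures relate (my arithmetic, not the white paper’s):

```python
# Back-of-the-envelope: how many days of loading at the minimum
# ingest rate before you hit the minimum total-capacity figure?
docs_per_day = 50_000
total_docs = 10_000_000
days = total_docs // docs_per_day
print(days)  # 200 days of sustained loading to reach 10 million
```

In other words, even at the minimum sustained rate, it takes well over half a year of continuous loading before the 10 million document figure even comes into play.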

Thanks, Mike Walsh, once again for his weekly WSS FAQ updates and corrections post.  If you’re not subscribed to it, you should seriously think about doing it.

</end>


Saving Older MS Office Files to SharePoint Using WebDAV — Problems and Fixes

During the past week, my colleague and I were doing some work for a client in NYC.  We were testing different aspects of a MOSS implementation using their "standard" workstation build (as opposed to our laptops).  While doing that, we ran into a few errors by following these steps:

  • Open up an MS Word document via Windows Explorer (which uses WebDAV).
  • Make a change.
  • Save it.

We came to realize that sometimes (usually the first time) we saved the document, the save didn’t "stick."  Save did not save.  We would pull that document back up and our changes simply were not there.

We didn’t understand the root issue at this point, but we figured we should make sure the latest MS Office service pack had been installed on that workstation.  The IT folks went and did that.  We went through the test again and discovered a new problem.  When we saved, we now got this error:

[screenshot: script error dialog]

This time, it seemed like every change was, in fact, saved, whether we answered Yes or No to the script question.

We finally had a look at the actual version of Office, and it turned out the workstation was running MS Office 2000 with Service Pack 3, which shows up under Help -> About as "Office 2002".

The moral of the story: I will always use Office 2003 as my minimum baseline office version when using WebDAV and MOSS.

</end>


(For search engine purposes, this is the error’s text):

Line: 11807

Char: 2

Error: Object doesn’t support this property or method

Code: 0

URL: http://sharepoint01/DocumentReview/_vti_bin/owssvr.dll?location=Documents/1210/testworddocument.doc&dialogview=SaveForm

Do you want to continue running scripts on this page?

SharePoint Migration Tip: Use “untagged data” Views For Incremental Migration

In one of my very first blog posts, I described the overall process we followed to migrate a customer from SPS 2003 to MOSS.  A reader left a comment asking for more detail, and here it is.

For that migration project, we had to find a good way to move a lot of SPS 2003 documents over to MOSS.  The initial load was easy enough: create a new target document library in MOSS and use Windows Explorer to move the documents.

This is the new document library:

[screenshot: the new, empty document library]

Open up two Windows Explorer windows.  Point the first at SPS 2003 and the second at the new document library in MOSS.  The following screenshot shows this.  Note that the top window is actually pointing at my c:\temp folder, but you can imagine it pointing to an SPS 2003 document library:

[screenshot: source and target Explorer windows]

After that drag and drop operation, my target looks like this:

[screenshot: the target library after the drag and drop]

Now it’s time to deal with the metadata.  Assume we have just one column of metadata for these documents, named "location."  We can see from the "all documents" view above that the location is blank.  It’s easy enough to use a datasheet view to enter the location, or even go into each document’s properties one by one to add it.  Let’s assume that there is no practical way to assign the location column a value automatically and that end users must do this by hand.  Furthermore, let’s assume there are hundreds of documents (maybe thousands) and that it will take many days to update the metadata.  As we all know, no one is going to sit down and work for four or five days straight updating metadata for documents.  Instead, they will break that out over a period of weeks or possibly longer.  To facilitate this process, we can create an "untagged data" view as shown:

[screenshot: creating the "untagged data" view]

Now, when someone sits down to spend their allocated daily hour or two to tag migrated documents, they can use the "untagged documents" view to focus their effort:

[screenshot: the "untagged documents" view]

As users tag documents, they drop off this list.
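Under the covers, an "untagged data" view is nothing more than a view whose filter keeps only the rows where the metadata column is empty.  Assuming the column’s internal name is "Location" (it may differ in your library), the view’s CAML filter looks roughly like this:

```xml
<Query>
  <Where>
    <IsNull>
      <FieldRef Name="Location" />
    </IsNull>
  </Where>
</Query>
```

You can get the same effect entirely through the browser by creating a view whose filter says the location column "is equal to" and leaving the value blank.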

This notion of an untagged data view can also help with a class of data validation problems people inquire about on the forums.  Out of the box, there’s no way to prevent a user from uploading a document to MOSS and then never entering metadata.  We can specify that a particular site column is mandatory, and the user won’t be allowed to push the save button without it.  However, if the user uploads and then closes the browser (or uses Windows Explorer to upload the document), we can’t force the user to enter metadata (again, out of the box).

This approach can help with that situation.  We can use a "poorly tagged data" view to easily identify these documents and correct them.  Couple this with a KPI and you have good visibility into the data, with drill-down to manage these exceptional circumstances.

</end>


MOSS Small Farm Installation and Configuration War Story

This week, I’ve struggled a bit with my team to get MOSS installed in a simple two-server farm.  Having gone through it, I have a greater appreciation for the kinds of problems people report on the MSDN forums and elsewhere.

The final farm configuration:

  • SQL/Index/Intranet WFE inside the firewall.
  • WFE in the DMZ.
  • Some kind of firewall between the DMZ and the internal server.

Before we started the project, we let the client know which ports needed to be open.  During the back and forth over that, we never explicitly said two important things:

  1. SSL means you need a certificate.
  2. The DMZ server must be part of a domain. 

On day one, we showed up to install MOSS and learned that the domain accounts for the database and MOSS hadn’t been created.  To move things along, we went ahead and installed everything with a local account on the intranet server.

At this point, we discovered the confusion over the SSL certificate and, sadly, decided to have our infrastructure guy come back later that week to continue installing the DMZ server.  In the meantime, we solution architects moved ahead with the business stuff.

A weekend goes by and the client obtains the certificate.

Our infrastructure guy showed up and discovered that the DMZ server was not joined to any domain (either a perimeter domain with limited trust or the intranet domain).  We wasted nearly half a day on that.  If we hadn’t let the missing SSL certificate bog us down, we would have discovered this earlier.  Oh well….

Another day passes and the various security committees, interested parties and (not so) innocent bystanders all agree that it’s OK to join the DMZ server with the intranet domain (this is a POC, after all, not a production solution).

Infrastructure guy comes in to wrap things up.  This time we successfully pass through the modern-day gauntlet affectionately known as the "SharePoint Configuration Wizard."  We have a peek in central administration and … yee haw! … the DMZ server is listed in the farm.  We look a little closer and realize we broke open the champagne a mite bit early.  WSS services are stuck in a "starting" status.

Long story short, it turns out that we forgot to change the identity of the service account via central administration from the original local account to the new domain account.  We did that, re-ran the configuration wizard and voila!  We were in business.

</end>


Learning the Hard Way — DMZ WFE Must be in a Domain

Although it’s not literally true, as a practical matter, an internet-facing web front end in a DMZ must be in a domain (i.e., not some standalone server in its own little workgroup).  It doesn’t need to be in the same domain as the internal WFE(s) and other servers (and probably shouldn’t be), but it needs to be in a domain.

My colleagues and I spent an inordinate amount of time on a proposal which included SharePoint prerequisites.  This included a comprehensive list of firewall configurations that would enable the DMZ server to join the farm and so forth.  Sadly, we failed to add a sentence somewhere to the effect of, "the whole bloody point of this configuration is to allow your DMZ WFE server, in a domain, to join the internal farm."

A perfect storm of events, where we basically looked left when we might have looked right, conspired to hide this problem from us until fairly late in the process, thus preventing me from invoking my "tell bad news early" rule.

Sigh.


Implementing Master / Detail Relationships Using Custom Lists

Forum users frequently ask questions like this:

> Hello,
>
> Please tell me if there are any possibilities to build a custom list with
> master and detail type (like invoices) without using InfoPath.
>

SharePoint provides some out-of-the-box features that support these kinds of business requirements.

In general, one links two lists together using a lookup column.  List A contains the invoice header information and list B contains invoice details.

Use additional lists to maintain customer numbers, product numbers, etc.

Use a content query web part (in MOSS only) and/or a data view web part to create merged views of the lists.  SQL Server Reporting Services (SRS) is also available for the reporting side of it.
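To make the lookup relationship concrete, here’s a small sketch in plain Python (the field names and data are made up) of the kind of merged header/detail view that a data view web part or SRS report produces for you from the two lists:

```python
# Invoice headers (list A) and invoice details (list B).  The
# "InvoiceID" value on each detail row plays the role of the lookup
# column pointing back to an item in the header list.
headers = [
    {"InvoiceID": 1001, "Customer": "Contoso"},
    {"InvoiceID": 1002, "Customer": "Fabrikam"},
]
details = [
    {"InvoiceID": 1001, "Product": "Widget", "Qty": 3},
    {"InvoiceID": 1001, "Product": "Gadget", "Qty": 1},
    {"InvoiceID": 1002, "Product": "Widget", "Qty": 5},
]

# Join the two "lists" on the lookup value -- conceptually what the
# merged view does for you.
by_id = {h["InvoiceID"]: h for h in headers}
merged = [
    {"Customer": by_id[d["InvoiceID"]]["Customer"], **d}
    for d in details
]
for row in merged:
    print(row["Customer"], row["Product"], row["Qty"])
```

Each detail row carries only the lookup value; the customer name has to be joined in at display time, which previews the "lookups pull back one column" limitation discussed below.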

However, there are some important limitations that will make it difficult to use pure out-of-the-box features for anything that is even moderately complex.  These include:

  • Size of related lookup lists vs. "smartness" of the lookup column type.  A lookup column presents itself in the UI differently depending on whether you’ve enabled multi-select.  In either case, the out-of-the-box control shows all available items from the source list.  If the source list has 1,000 items, that’s going to be a problem.  The lookup control does not page through those items.  Instead, it pulls all of them into the control.  That makes for a very awkward user interface, both in terms of data entry and performance.
  • Lookups "pull back" one column of information.  You can never pull back more than one column of information from the source list.  For instance, you cannot select a customer "12345" and display the number as well as the customer’s name and address at the same time.  The lookup only shows the customer number and nothing else.  This makes for an awkward and difficult user interface.
  • No intra-form communication.  I’ve written about this here.  You can’t implement cascading drop-downs, conditionally enable/disable fields, etc. 
  • No cascading deletes or built-in referential integrity.  SharePoint treats custom lists as independent entities and does not allow you to link them to each other in a traditional ERD sense.  For example, SharePoint allows you to create two custom lists, "customer" and "invoice header".  You can create an invoice header that links back to a customer in the customer list.  Then, you can delete the customer from the list.  Out of the box, there is no way to prevent this.  To solve this kind of problem, you would normally use event handlers.

It may seem bleak, but I would still use SharePoint as a starting point for building this kind of functionality.  Though there are gaps between what you need and what SharePoint provides out of the box, it gives us tools to fill those gaps, such as:

  • Event handlers.  Use them to enforce referential integrity.
  • Custom columns: Create custom column types and use them in lieu of the default lookup column.  Add paging, buffering and AJAX features to make them responsive.
  • BDC.  This MOSS-only feature enables us to query other SharePoint lists with a superior user interface to the usual lookup column.  BDC can also reach out to a back end server application.  Use BDC to avoid replication.  Rather than replicating customer information from a back end ERP system, use BDC instead.  BDC features provide a nice user interface to pull that information directly from the ERP system where it belongs and avoids the hassle of maintaining a replication solution.

    BDC is a MOSS feature (not available in WSS) and is challenging to configure. 

  • ASP.NET web form: Create a full-featured AJAX-enabled form that uses the SharePoint object model and/or web services to leverage SharePoint lists while providing a very responsive user interface.

The last option may feel like you’re starting from scratch, but consider the fact that the SharePoint platform starts you off with the following key features:

  • Security model with maintenance.
  • Menu system with maintenance.
  • "Master table" (i.e. custom lists) with security, built-in maintenance and auditing.
  • Search.
  • Back end integration tools (BDC).

If you start with a new blank project in Visual Studio, you have a lot of infrastructure and plumbing to build before you get close to what SharePoint offers.

I do believe that Microsoft intends to extend SharePoint in this direction of application development.  It seems like a natural extension of the existing SharePoint base.  Microsoft’s CRM application provides a great deal of extensibility of the types needed to support header/detail application development.  Although those features are in CRM, the technology is obviously available to the SharePoint development team, and I expect it will make its way into the SharePoint product by the end of 2008.  If anyone has any knowledge or insight into this, please leave a comment.

</end>