Monthly Archives: June 2008

June 2008 SUGDC Conference — That’s a Wrap

I attended my first ever SharePoint conference this past weekend and it was a blast. 

Thursday afternoon, I drove down to Virginia, guided by my newly purchased $50 GPS plug-in for my phone.  The device was flawless.  After the five-hour drive, I had the energy to do a nice run on the treadmill and then, even more surprisingly, had the energy to head to the lobby for an advertised speaker’s cocktail hour.  Conference n00b that I am, I didn’t realize the cocktail hour was really a ruse to get speakers to show up and help stuff papers and swag into shoulder bags for conference attendees 🙂 

Had a hard time sleeping because I was speaking first thing Friday AM.  Nervousness, a nagging feeling that I needed to add a slide to my presentation, and a very disturbing cat show on Animal Planet kept me up late.  Since I went to sleep late, I naturally got up early.  I did add a fairly detailed technical architecture slide.  It was well worth the effort because the 25 minutes of Q&A would have been very awkward without it.  I was lucky to get the first slot in the technical track.  Sahil Malik was originally going to speak Friday AM and I was going to speak Saturday, but he needed to swap times.  This allowed me to do my presentation and then sit back and enjoy everything going forward Friday and Saturday.

The presentation went OK.  I definitely have room to improve it.  I spoke about how we can access and use web services from a SharePoint Designer workflow using a custom action.  Over time, I will tie this information into my series over at EUSP.com for End Users trying to get the most use out of that tool.  I blew through my slides and demo in 35 minutes, to my dismay at the time.  Luckily, Q&A was lively, no doubt helped by the fact that it was early morning before lunch.  Q&A is my favorite part of any presentation. 

There were many interesting subjects and I hope to blog about them in greater detail this week (time permitting, as always).  A fellow from CMS Watch provided a highly critical yet very hopeful review of SharePoint’s position in the market.  A different discussion focused on the paucity of SharePoint resources and the difficulty that recruiters have finding good talent that is also "affordable" in this very tight market.  The CMS Watch guy referred to the SharePoint human resources pool as being like a "guild."  I’m mainly familiar with that term from MMORPGs, and it gave me a little thrill, to be honest 🙂 

The highlight of the conference was just meeting and catching up with people I’ve "known" online for a while.  The best was sitting at the bar with Becky Isserman (MossLover) for 3 or 4 hours (and that, after I had finished drinking for the night).  I don’t often get to talk about Farscape or Babylon 5 with Kansas City residents.

Bob Fox was there and, as usual, was a whirlwind of intros, chats and just plain frenetic energy.  He invited me to Saturday breakfast with Sahil Malik and that was great. 

Saturday (day 2), Mike Lotter dragged himself to the conference to speak about InfoPath and then he joined Becky at the end of the day to do a sort of general Q&A session for about 30 to 45 minutes mainly focused on InfoPath (Mike) and AJAX (Becky).  I wish Becky had been able to go through her full/formal presentation but I’m sure I’ll get a chance to see that one of these days.  I have a feeling she’ll be "hitting the circuit" going forward.

I could go on and on.  Two last points — the financial purpose of the conference was to raise money for the Children’s Miracle Network and it raised $5,000.  That was awesome.  Finally, I want to publicly thank Gary Blatt, Gary Vaughn and Bob Fox for alerting me to the conference and allowing me to speak there.  Of course, the two Garys had a team of people supporting and organizing, and all of you were awesome.  I had high expectations before I went and it was better than I had hoped for.

Keep on the alert for the next conference, scheduled for November 7th and 8th.  Aside from some great content, it’s terrific for meeting up with all those online personalities you’ve known through blogs, Twitter, forums, etc. 


 Subscribe to my blog.


FBA and SQL Server: A Love Story

My colleague has been working on a web part in an FBA environment.  Among other things, the web part pulls some data from SQL Server.  The grand plan for this project dictates that a DBA configure data-level security in SQL Server (as opposed to embedding a user ID in a SQL query or some other approach).

The problem is that SQL Server doesn’t know anything about our FBA environment, so it can’t trust us.  We solved this problem by, for lack of a better phrase, manually impersonating an AD user so that we could connect to SQL Server in a way that lets data-level security work. 

Even though FBA is an ASP.NET feature, we SharePoint Nation people have taught the various search engines that if you’re querying for FBA, you must want to know how to configure FBA in SharePoint.  I failed to find any information on how to enable an FBA-oriented ASP.NET application to communicate with SQL Server in the way we needed. 

In the course of researching this, we re-read this article: ASP.NET Impersonation

More research led us to this CodeProject article: http://www.codeproject.com/KB/cs/cpimpersonation1.aspx

That helped us write our code, which I’ve included below.  It’s not the most elegant stuff, but it worked.  I hope you find it helpful.

Here’s the code that worked for us:

 

// Required using directives:
// using System;
// using System.Runtime.InteropServices;
// using System.Security.Permissions;
// using System.Security.Principal;

protected void btnSearchCarrier_Click(object sender, EventArgs e)
{
    ImpersonateUser iU = new ImpersonateUser();

    try
    {
        // TODO: Replace credentials (better yet, pull them from a secure
        // store rather than hard-coding them).
        iU.Impersonate("DomainName", "UserName", "Password");

        //
        // YOUR CODE HERE
        //
    }
    finally
    {
        // Always revert, even if the code above throws.
        iU.Undo();
    }
}

// The ImpersonateUser class, adapted from the CodeProject article above:

public class ImpersonateUser
{
    [DllImport("advapi32.dll", SetLastError = true)]
    public static extern bool LogonUser(
        String lpszUsername,
        String lpszDomain,
        String lpszPassword,
        int dwLogonType,
        int dwLogonProvider,
        ref IntPtr phToken);

    [DllImport("kernel32.dll", CharSet = CharSet.Auto)]
    private extern static bool CloseHandle(IntPtr handle);

    private static IntPtr tokenHandle = new IntPtr(0);
    private static WindowsImpersonationContext impersonatedUser;

    // If you incorporate this code into a DLL, be sure to demand that it
    // runs with FullTrust.
    [PermissionSetAttribute(SecurityAction.Demand, Name = "FullTrust")]
    public void Impersonate(string domainName, string userName, string password)
    {
        // Use the unmanaged LogonUser function to get the user token for
        // the specified user, domain, and password.
        const int LOGON32_PROVIDER_DEFAULT = 0;

        // This logon type causes LogonUser to create a primary token.
        const int LOGON32_LOGON_INTERACTIVE = 2;
        tokenHandle = IntPtr.Zero;

        // Step 1: Call LogonUser to obtain a handle to an access token.
        bool returnValue = LogonUser(
            userName,
            domainName,
            password,
            LOGON32_LOGON_INTERACTIVE,
            LOGON32_PROVIDER_DEFAULT,
            ref tokenHandle);

        if (false == returnValue)
        {
            // Surface the Win32 error rather than silently swallowing it
            // (an empty catch here means impersonation quietly never happens).
            int ret = Marshal.GetLastWin32Error();
            throw new System.ComponentModel.Win32Exception(ret);
        }

        // Step 2: Build a WindowsIdentity from the token.
        WindowsIdentity newId = new WindowsIdentity(tokenHandle);

        // Step 3: Begin impersonating.
        impersonatedUser = newId.Impersonate();
    }

    /// <summary>
    /// Stops impersonation and frees the token handle.
    /// </summary>
    public void Undo()
    {
        if (impersonatedUser != null)
            impersonatedUser.Undo();

        if (tokenHandle != IntPtr.Zero)
            CloseHandle(tokenHandle);
    }
}


Adding to the Lore: SSRS Tells Me “rsAccessDenied”, But … I Really DO Have Access

A few weeks back, I was working with my developer colleague on a project involving the SQL Server Reporting Services plug-in for MOSS.  He was developing a web part that provides a fancy front end to the report proper (the main feature being a clever lookup on a parameter with several thousand searchable values behind it).

This was working great in the development environment but in the user acceptance testing (UAT) environment, it wouldn’t work.  Firing up the debugger, we would see exception details like this:

The permissions granted to user ‘UAT_domain\mosssvc’ are insufficient for performing this operation.(rsAccessDenied).

If you do a live search on the above error, you find it’s quite common.  Scarily common.  The worst kind of common because it has many different potential root causes and everyone’s suggested solution "feels" right.  We probably tried them all.

In our case, the problem was that we had done a backup/restore from DEV to UAT.  Somewhere in the data, something still referred to "DEV_domain" (instead of the updated "UAT_domain").  We created a new site and added the web part there, and that solved our problem.

Hopefully this will save someone an hour or two down the line.


Quick Fix: Accessing SharePoint Site Throws [HttpException (0x80004005): Request timed out.]

One of my developer colleagues was working on a project this week and ran into a timeout problem while working on building some crazy web part.  His web part was fine, but "suddenly" an unrelated site became very slow and frequently timed out with this error:

[HttpException (0x80004005): Request timed out.]

I logged in and saw that several other sites were just fine.  I suspected that there were some hidden web parts on the page and using the trusty ?contents=1 debug technique, I did in fact find 11 web parts on the page, only two of which were visible.  Even better (from a let’s-hope-I-find-something-ugly-here-that-I-can-fix perspective), three of those closed web parts had a name of "Error".
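For anyone who hasn’t run across the technique: append ?contents=1 to the page’s URL (the server and site names below are placeholders) and WSS serves up the Web Part Maintenance Page, which lists every web part on the page, closed and errored ones included, and lets you delete them:

```
http://server/sites/somesite/default.aspx?contents=1
```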

I deleted those web parts (which itself took a surprisingly long time) and that solved the problem.  For today 🙂


In-class FAST Training is Excellent

I’m starting day 4 of FAST’s partner training headed up by Larry Kaye here in Needham, MA.

This 5-day session is broken down into classes (3 and 2 days respectively) entitled "FAST ESP: Developing Custom Search Applications for Alliance Partners I" and "FAST ESP: Developing Custom Search Applications for Alliance Partners II".

This is a real boot-camp type class. The material is deep (very, very deep). The instructor (Larry) clearly knows his stuff. I highly recommend this training if you’re considering it. 


SharePoint and FAST — the Reese’s Peanut Butter Cups of Enterprise Apps?

I’ve finished up day 2 of FAST training in sunny Needham, MA, and I’m bursting with ideas (as all the good training classes do to me).  One particular aspect of FAST has me thinking, and I wanted to write it down while it was still fresh, before normal day-to-day "stuff" pushed it out of my head.

We SharePoint WSS 3.0 / MOSS implementers frequently face a tough problem with any reasonably-sized SharePoint project: How do we get all the untagged data loaded into SharePoint such that it all fits within our perfectly designed information architecture?

Often enough, this isn’t such a hard problem because we scope ourselves out of trouble: "We don’t care about anything more than 3 months old."  "We’ll handle all that old stuff with keyword search and going-forward we’ll do it the RIGHT way…"  Etc. 

But what happens if we can’t scope ourselves out of trouble and we’re looking at tens of thousands or hundreds of thousands (or even millions) of docs — the loading and tagging of which is our devout wish?

FAST might be the answer.

FAST’s search process includes a lot of moving parts but one simplified view is this:

  • A crawler process looks for content.
  • It finds content and hands it off to a broker process that manages a pool of document processors.
  • The broker hands it off to one of the document processors.
  • The document processor runs the document through a pipeline, analyzes the bejeezus out of it, and hands it off to an index builder type process.
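The steps above can be sketched in a toy fashion.  To be clear, every name below is my own invention for illustration purposes, not FAST ESP’s actual API (real pipeline stages are configured through ESP’s own tooling):

```python
# A toy sketch of the crawl -> broker -> processor -> index flow described
# above. All class and function names here are invented; they are NOT the
# real FAST ESP API.

class DocumentProcessor:
    """Runs a document through a pipeline of analysis stages."""

    def __init__(self, stages):
        self.stages = stages

    def process(self, document):
        # Each stage enriches the document and passes it along.
        for stage in self.stages:
            document = stage(document)
        return document

def extract_title(doc):
    # Trivial "analysis" stage: treat the first line as the title.
    doc["title"] = doc["body"].splitlines()[0]
    return doc

def count_words(doc):
    doc["word_count"] = len(doc["body"].split())
    return doc

class Broker:
    """Hands crawled content to a pool of document processors (round-robin)."""

    def __init__(self, processors):
        self.processors = processors
        self.next = 0

    def dispatch(self, document):
        processor = self.processors[self.next % len(self.processors)]
        self.next += 1
        return processor.process(document)

# The "crawler" finds content and hands it to the broker; the processed
# result would then move on to an index builder.
broker = Broker([DocumentProcessor([extract_title, count_words])])
indexed = [broker.dispatch({"body": body})
           for body in ["Annual Report\nRevenue was up.", "Memo\nLunch at noon."]]
print(indexed[0]["title"], indexed[0]["word_count"])
```

The point of the sketch is the shape of the thing: the pipeline is just an ordered list of stages, which is exactly why being able to write your own stage is so powerful.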

On the starship FAST, we have a lot of control over the document processing pipeline.  We can mix and match about 100 pipeline components and, most interestingly, we can write our own components.  Like I say, FAST is analyzing documents every which way but Sunday and it compiles a lot of useful information about those documents.  Those crazy FAST people are clearly insane and obsessive about document analysis because they have tools and/or strategies to REALLY categorize documents.

So … using FAST in combination with our own custom pipeline component, we can grab all that context information from FAST and feed it back to MOSS.  It might go something like this:

  • Document is fed into FAST from MOSS.
  • Normal crazy-obsessive FAST document parsing and categorization happens.
  • Our own custom pipeline component drops some of that context information off to a database.
  • A process of our own design reads the context information, makes some decisions on how to fit that MOSS document within our IA and marks it up using a web service and the object model.
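That last decision step might look something like the back-of-the-napkin sketch below.  Everything here (the rules, the field names) is invented for illustration; the real version would read from whatever database the custom pipeline component populated and then tag the document in MOSS via a web service or the object model:

```python
# Toy sketch of the "process of our own design": read the context info our
# pipeline component saved and decide where the document belongs in our IA.
# The rules and field names are hypothetical.

def classify(context):
    """Map FAST-extracted context onto a target library and content type."""
    if "invoice" in context.get("detected_entities", []):
        return {"library": "Finance", "content_type": "Invoice"}
    if context.get("language") != "en":
        return {"library": "International", "content_type": "Document"}
    return {"library": "General", "content_type": "Document"}

# In real life the context would come from the database, and the decision
# would drive a call to SharePoint to move/tag the document.
context = {"detected_entities": ["invoice", "vendor"], "language": "en"}
decision = classify(context)
print(decision)
```

However simple, rules like these driven by FAST’s rich analysis beat hand-tagging a hundred thousand documents.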

Of course, no such automated process can be perfect but thanks to the obsessive (and possibly insane-but-in-a-good-way FAST people), we may have a real fighting shot at a truly effective mass load process that does more than just fill up a SQL database with a bunch of barely-searchable documents.


Learning About End Users At www.EndUserSharePoint.com

Mark Miller over at http://www.endusersharepoint.com has built, in my experience, the best end-user focused SharePoint site in the ‘sphere.  In the last month, he has enlisted some of the premier end-user focused bloggers around to contribute to the "front page" on a regular basis, including but not limited to Paul Culmsee, Chris Quick, and Dessie Lunsford.  He has others lined up and ready to contribute as their schedules allow.

I jumped at the chance to participate and my inaugural post is here.  I’m writing a series on how to use SharePoint Designer to create first-class business workflow solutions.  In keeping with EUSP.com’s focus, those articles will always keep the End User front and center.

I personally tend to divide the SharePoint world into three broad groups:  SharePoint consultants, full-time SharePoint staff developers and end users.  When I write, I often ask myself, which of these groups might be interested in the subject?  Most often, I end up writing for the first two (technical) groups, mainly because I’m a consultant myself; it’s always easier and more authentic to write about those things with which you’re most familiar on a personal level. 

As I’ve noted before, the end user community is far, far larger than the technical community.  EUSP.com is top-notch and I heartily recommend it to all three groups.  The site’s laser focus is obviously valuable to end users.  However, we developers and consultants can only be better at our profession if we can understand and effectively respond to the needs of the end users we serve.  I know I need all the help I can get 🙂  Check it out.


Invoking SSRS Web Services From WSS / MOSS in FBA Environment

We needed to invoke the "CreateSubscription" method on an SSRS web service that is hosted in an FBA managed MOSS environment from a custom web part.  We kept getting variations of:

  • 401: Not authorized
  • Object Moved

The "object moved" message was most interesting because it was saying that the "object" (our SSRS service) had "moved" to login.aspx.  This clearly meant we had some kind of authentication problem.

I eventually realized that I had bookmarked a blog entry by Robert Garret that described how to invoke a general purpose WSS/MOSS web service living inside an FBA environment.  Note that I can’t link directly to the article (as of 06/09/08) because it wants to authenticate.  The link I provide brings you to an "all posts" view and you can locate the specific article by searching for "Accessing MOSS Web Services using Forms Based Authentication".

Here’s the code that worked for us:

ReportingService2006 rs = null;

// Authenticate against MOSS.
Authentication auth = new Authentication();
auth.Url = "http://URL/_vti_bin/Authentication.asmx";
auth.CookieContainer = new CookieContainer();
LoginResult result = auth.Login("userid", "password");

if (result.ErrorCode == LoginErrorCode.NoError)
{
    // No error, so get the authentication cookie.
    CookieCollection cookies = auth.CookieContainer.GetCookies(new Uri(auth.Url));
    Cookie authCookie = cookies[result.CookieName];

    // Hand that cookie to the SSRS service proxy.
    rs = new ReportingService2006();
    rs.Url = "http://server/_vti_bin/ReportServer/ReportService2006.asmx";
    rs.CookieContainer = new CookieContainer();
    rs.CookieContainer.Add(authCookie);
}

try
{
    rs.CreateSubscription(report, extSettings, desc, eventType, matchData, parameters1);
}
catch (Exception ex)
{
    Console.WriteLine(ex.Message);
}

I interpret things to work like this:

  • Our web part needs to dial up the authentication service and say, "Hey, Tony, it’s me!".
  • Authentication service replies saying, "Hey, I know you.  How are the kids?  Here’s a token."
  • We call up the SSRS service and say, "Tony sent me, here’s the token."


Have You Performed Your Monthly Search Analysis?

It’s a good practice, probably even a Best Practice, to review your search reports once a month and look for opportunities to add best bets, tune your thesaurus and maybe even uncover some business intelligence that is otherwise hidden to management. 

It’s already the 3rd of the month.  Time’s awastin’ 🙂
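As an example of the thesaurus-tuning piece: in MOSS this means editing an XML file (tsenu.xml for US English) on each query server, and I believe a search service restart is needed for the change to take effect.  The terms below are placeholders; a minimal sketch, assuming the standard thesaurus schema:

```
<XML ID="Microsoft Search Thesaurus">
  <thesaurus xmlns="x-schema:tsSchema.xml">
    <expansion>
      <sub>IA</sub>
      <sub>information architecture</sub>
    </expansion>
    <replacement>
      <pat>moss</pat>
      <sub>SharePoint</sub>
    </replacement>
  </thesaurus>
</XML>
```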


Faceted Search Fence Sitter No More

I had reason today to play about with the CodePlex faceted search project. 

It’s been around for a while, but I hesitated to download and use it for the usual reasons (mainly lack of time), plus outright fear 🙂

If you’re looking to improve your search and explore new options, download it and install it when you have an hour or so of free time.  I followed the installation manual’s instructions and it took me less than 20 minutes to have it installed and working.  It provides value from minute zero.

It does look pretty hard to extend.  The authors provide a detailed walk-through for a complex BDC scenario.  I may be missing it, but I wish they would also provide a simpler scenario involving one of the pre-existing properties, or maybe adding one new managed property.  I shall try to write that up myself in the near future.

Bottom line — in minutes, you can install it, configure it, use it to add some pretty cool functionality to your vanilla MOSS search, and be a hero 🙂
