Josh Elman Facebook Connect Presentation at SDForum

A coworker passed on this link as an introduction to Facebook Connect.

This YouTube video is a bit rough when it comes to production value: the lecture was given in a restaurant, and the video appears to have been shot without a tripod, so there is a lot of camera shake. Don't let this put you off; the content is good.

The presentation is 38 minutes long. Here is a rough outline:

  • Rationale for Facebook Connect
  • Identity
  • Friends
  • Feeds
  • Privacy
  • How it Works
  • UI Components
  • Demo / Sample Code (code is not legible in the video)

Zune Outage – I Can Relate

On the last day of 2008, a leap year, some models of the Zune reportedly stopped working. When the date rolled over to January 1, 2009, everything was fine again. Even though I was not impacted by this bug, I can relate.

At my second job out of college, our product was hit by a similar bug. The product used both the Gregorian and Julian calendars (I do not recall why…). We had done all of the normal leap year tests around February 29 for the year in question. On New Year's Eve and New Year's Day, our support lines lit up. Some month-end, quarter-end, and year-end processes were failing. Sure enough, the code did not expect 366 days in the year.
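
The exact code is long gone, so the sketch below is hypothetical (and in Python, which our product was not written in), but it shows the class of bug: a year-end check that hard-codes 365 days per year and therefore never fires on December 31 of a leap year.

    import datetime

    def day_of_year(d: datetime.date) -> int:
        """1-based ordinal day within the year."""
        return (d - datetime.date(d.year, 1, 1)).days + 1

    def is_year_end_buggy(d: datetime.date) -> bool:
        # The bug: 365 is hard-coded, so day 366 of a leap year
        # is never recognized as year end.
        return day_of_year(d) == 365

    def is_year_end_fixed(d: datetime.date) -> bool:
        # The fix: ask the calendar instead of assuming a constant.
        return d.month == 12 and d.day == 31

    dec31 = datetime.date(2008, 12, 31)  # 2008 was a leap year
    print(is_year_end_buggy(dec31))      # False: year-end processing never runs
    print(is_year_end_fixed(dec31))      # True

February 29 tests pass this code just fine; the failure only shows up at the year-end boundary, exactly as it did for us.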

Ever since that time, I have wanted a set of systems running some number of months in the future so that I could discover these misses before our customers did.
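
One way to set that up, sketched below, is to have the application read dates through an injectable clock instead of calling the system clock directly; a test environment is then wired with a clock that runs ahead. The class names here (SystemClock, OffsetClock) are hypothetical.

    import datetime

    class SystemClock:
        """Production clock: reports the real current date."""
        def today(self) -> datetime.date:
            return datetime.date.today()

    class OffsetClock(SystemClock):
        """Future-dated test clock: reports a date a fixed number of
        days ahead, so rollover bugs (leap days, year ends, quarter
        ends) surface in-house before customers reach them."""
        def __init__(self, days_ahead: int) -> None:
            self._offset = datetime.timedelta(days=days_ahead)

        def today(self) -> datetime.date:
            return datetime.date.today() + self._offset

    # A test system running roughly six months ahead would have hit
    # December 31 of the leap year long before production did.
    clock = OffsetClock(days_ahead=183)
    print(clock.today())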

2008 Testers Choice Awards

While working through my backlog of reading, I came across the results of Software Test & Performance magazine’s Testers Choice Awards for 2008. This list is of no use to me.

First, look at most of the winners: LoadRunner, TestDirector, components of the IBM Rational suite, and other high-priced commercial products. I work at a startup now (and have at several in the past), and there is rarely budget for these tools. Oftentimes, the tools imply a process or a level of maturity that does not exist in the first couple of years of a startup's existence. A more useful breakdown would be one set of testers choice awards for open source tools and another for commercial tools. (There is a category for free test/performance tools, but that is about as broad as giving an award for best vertebrate; there is just too much variation.)

That is actually the second point: because of the diversity of the software development space, a generic list such as this does not shed light on tools that may be applicable to my particular area. This is similar to an example James Surowiecki used in The Wisdom of Crowds regarding generic top-10 lists in music. The top-10 list for all music may contain one song, or none, from the genre I am interested in, and most likely I am already familiar with that song. If I am interested in that genre, the generic list only touches my interests tangentially and supplies no new information; if I am looking for new songs like the one I like on the generic list, the other entries may not help. However, a top-10 list for the specific type of music (the more specific the better) may lead me to new songs I will enjoy.

The same is true of tools. A "testers choice award for Java code coverage tools" might point me to tools I am not familiar with, while the category "java test/performance" does not point me to any tools I am not already familiar with. Knowing which Java code coverage tool others consider the best is much more useful than knowing that LoadRunner won the java test/performance category for 2008.