Saturday, June 14, 2008

6th Sense Analytics adds new features for collecting development productivity metrics

6th Sense Analytics, which collects and provides metrics on software development projects, this week announced several enhancements to its flagship product. These enhancements provide a more user-friendly interface and organize reports into workspaces that more closely align with the way each user works.

The Morrisville, N.C. company targets its products at companies that want to manage outsourced software development. Its software automatically collects and analyzes unbiased, activity-based data throughout the entire software development lifecycle. [Disclosure: 6th Sense has been a sponsor of BriefingsDirect podcasts.]

Among the enhancements to the product are:
  • Reports can now be scheduled for daily, weekly, or monthly delivery by email, reducing the number of steps required to access them and making them easier to fold into customers' work routines.

  • Users can now select specific reports, so they see only the information pertinent to their needs.

  • The registration process has been streamlined. When a new user is invited to a team, the account is activated immediately and the user receives a welcome email with getting-started details, including instructions for installing the desktop client. Removing users has also been simplified.

  • Reports can now be rendered in any time zone, for customers working with resources spread across a country or on multiple continents (a generic sketch of the idea appears after this list).
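
To illustrate the general idea behind time-zone-aware reporting, here is a minimal Python sketch, not 6th Sense's actual implementation: activity timestamps are stored in UTC and rendered in each report viewer's local time zone. The record fields, function name, and sample data are hypothetical, chosen only for illustration.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+ standard library

# Hypothetical activity records, with timestamps collected in UTC.
activity = [
    {"developer": "dev-a", "utc": datetime(2008, 6, 13, 14, 30, tzinfo=timezone.utc)},
    {"developer": "dev-b", "utc": datetime(2008, 6, 13, 22, 5, tzinfo=timezone.utc)},
]

def localize_report(records, tz_name):
    """Render each activity timestamp in the report viewer's own time zone."""
    tz = ZoneInfo(tz_name)
    return [
        {"developer": r["developer"], "local_time": r["utc"].astimezone(tz).isoformat()}
        for r in records
    ]

# A manager in North Carolina and one in Bangalore see the same underlying
# data, each expressed in their respective local times.
print(localize_report(activity, "America/New_York"))
print(localize_report(activity, "Asia/Kolkata"))
```
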
I've been following 6th Sense Analytics since they first emerged on the scene. Last year, I recorded a podcast with Greg Burnell, chairman, co-founder and CEO, in which he explained the need for metrics in outsourced projects. You can listen to the podcast here and read a complete transcript here.

Last August, I reported on the first metrics that 6th Sense Analytics had released to the public. Those findings confirmed some things people already knew and also provided some unexpected insights. I saw real value in the data:

And these are not survey results. They are usage data aggregated from some 500 active developers over the past several weeks, and therefore make a better reference point than "voluntary" surveys. These are actual observations of what the developers actually did, not what they said they did or tried to remember doing (if they decided to participate at all). So the results are empirical for the sample, even if the sample itself may not yet be generally representative.
