Friday, August 3, 2007

IBM adds to 'information on demand' drive with Princeton Softech acquisition

IBM is beefing up its "information on demand" initiative with the announcement today of its intention to acquire Princeton Softech, Inc.

Princeton's Optim cross-platform data management software will give a big boost to meeting data-governance needs, as well as to controlling the costs of growing data volumes. This is becoming a primary corporate concern in light of estimates that storage management may soon represent nearly 50 percent of an annual IT budget.

As organizations are required to retain data longer for auditing, cost becomes an issue if archival data remains on operational systems, eating up storage capacity and degrading performance. Princeton's archiving offerings help remove that data from those systems while keeping it accessible and usable.

The other prong of regulatory requirements is data security, especially for customer information that is deemed private. In addition to the costs of maintaining huge amounts of historical data on operational systems, the potential penalties for exposing private customer data can be daunting. Princeton's data-masking capability is designed to preserve data integrity while enabling efficient archiving.

Princeton also provides test data management software that creates test databases, in which sensitive customer data can be masked and the underlying data protected from corruption during the tests.
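The idea behind test data management can be shown in miniature. The sketch below is not Optim's actual API (which isn't described here); it's a hypothetical Python illustration of the general technique: cloning production rows into a test database with sensitive fields masked deterministically, so format and referential integrity survive while the real values don't.

```python
import hashlib

# Hypothetical illustration of data masking for test databases -- not
# Princeton/Optim's actual API. Goal: realistic-looking test data with
# sensitive values replaced deterministically, so the same source value
# maps to the same masked value everywhere (referential integrity).

def mask_digits(value: str, salt: str = "test-env") -> str:
    """Replace each digit with one derived from a hash, keeping the format."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    digits = (c for c in digest if c.isdigit())
    out = []
    for ch in value:
        # Non-digit characters (hyphens, dots) pass through unchanged.
        out.append(next(digits, "0") if ch.isdigit() else ch)
    return "".join(out)

def mask_row(row: dict, sensitive: set) -> dict:
    """Mask only the fields named in `sensitive`; copy the rest verbatim."""
    return {k: mask_digits(v) if k in sensitive else v for k, v in row.items()}

prod_row = {"name": "J. Smith", "ssn": "123-45-6789", "balance": "1042.17"}
test_row = mask_row(prod_row, sensitive={"ssn"})

assert len(test_row["ssn"]) == len(prod_row["ssn"])   # format preserved
assert test_row["ssn"].count("-") == 2                # layout preserved
assert mask_row(prod_row, {"ssn"}) == test_row        # deterministic
```

Because the masking is deterministic, the same customer key masks to the same value in every table, so joins in the test database still line up.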

The acquisition is one of a long string of smaller, often private companies that IBM has been buying to fill out its data lifecycle offerings. As we've said before, getting your data act together is an essential aspect of being able to move to SOA. This purchase seems to buttress that approach.

Princeton Softech, with 240 employees, is privately held, and has been in operation since 1989. No financial details were disclosed for the deal, which needs regulatory approval. Both companies hope the acquisition will be complete within the next two months.

OpenSpan report card: Plays well with others

Quick question: What is the most-used technology for integrating apps on the desktop? If you said "copy-and-paste," then you'd be right, and it probably means you've been listening to Francis Carden, CEO of OpenSpan Inc.


Carden uses the copy-and-paste statistic to emphasize how little integration has advanced in the industry, despite all the effort of the last two decades.

OpenSpan of Alpharetta, Ga., offers what it claims is a new and unique way to integrate the multitude of currently siloed apps on which many operations rely today. OpenSpan works by identifying the objects in any program that interact with the operating system -- whether a Windows app, a Web page, a Java application, or a legacy green-screen program -- then exposing and normalizing those objects, effectively breaking down the walls between applications.

The OpenSpan Studio provides a graphical interface in which users can view programs, interrogate applications and expose the underlying objects. Once the objects are exposed, users can build automations between and among the various programs and apply logic to control the results.

For example, the integration can be designed to take information from legacy applications, pass it to an Internet search engine, perform the search, and pass the result on to a third application. All this can be done transparently to the end user with a single mouse click. The operation can run with the applications involved open on the desktop, or they can run in the background while the integration runs in a dashboard.
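The chaining described above can be sketched abstractly. This is not OpenSpan's product or API -- its real tooling interrogates live applications graphically -- but a hypothetical Python sketch in which plain callables stand in for the exposed objects, to show how an automation wires one program's output into the next:

```python
# Conceptual sketch only -- hypothetical stand-ins for "exposed objects"
# from three applications, chained into a single one-click automation.

def read_account_field():           # stands in for a legacy-app screen field
    return "ACME Corp"

def web_search(query):              # stands in for a search-engine page object
    return f"results for '{query}'"

def display_in_crm(text):           # stands in for a third application
    return f"CRM shows: {text}"

def one_click_automation(steps):
    """Run each step, feeding the previous step's output into the next."""
    value = None
    for step in steps:
        value = step(value) if value is not None else step()
    return value

result = one_click_automation([read_account_field, web_search, display_in_crm])
assert result == "CRM shows: results for 'ACME Corp'"
```

The point of the pattern is that the three "applications" never reference each other; the automation layer supplies the glue and any control logic.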

Setting up such an integration in OpenSpan can take anywhere from a few minutes to a few days, depending on the complexity of the operation and the number of programs involved.

What happens if the objects in one of the source programs change, something that could happen frequently if third-party Web pages are involved? According to Carden, it's a simple matter of re-interrogating the affected objects on the revised page and replacing them in the original workflow, using OpenSpan Studio.

While others are trying to do what OpenSpan does, Carden says that the others do it in different ways and that his company's approach is unique. It does not require specific programming knowledge, nor does it require access to the source code of the underlying programs or recompiling those codes.

"I like to say we're the 'last mile to SOA,'" Carden said. "We can take a 25-year-old application and make it consume a Web service."

According to Carden, the idea of digging into applications and objects at the operating system level was something that was always able to be done by what he calls "rocket scientists." The only problem, he says, is that it was a time-consuming process and was basically a one-off effort. OpenSpan is a way to productize the process and make it available to non-rocket scientists.

For some companies, integrating applications is critical to performance and agility. Carden tells of one client, a large bank, whose workers had to deal with 1,600 applications -- though that's an extreme case; the average is about eight.

OpenSpan is currently riding high with an infusion of venture capital from Sigma Partners and Matrix Partners and an infusion of talent with the addition of four key players who were formerly executives with JBoss.

Microsoft's 'service-enabled' approach is really only sorta-SOA

A new article on Redmondmag.com examines the fascinating topic of Microsoft and SOA. Microsoft's strategy -- pivoting on the strengths of Windows Communications Foundation (WCF) as well as (huh?) applications virtualization -- is different from earlier "embrace and extend" market assault campaigns.

Microsoft's SOA strategy amounts to embrace and contract. It wants to allow services-enablement using the newest (read: massive upgrades) Microsoft infrastructure, offering a modest embrace of heterogeneity. However, the high-level rationale for SOA is to make heterogeneity a long-term asset, rather than a liability. You should not have to massively upgrade to .NET Framework 3.0 (and tools, and server platforms, and runtimes) to leverage SOA. As we know, SOA is a style and conceptual computing framework, not a reason to upgrade.

I'm liberally quoted in this article, so you can gather the gist of my observations in it. What is curious is the response Microsoft proffers to any suggestion that Windows Everywhere is not somehow synonymous with SOA.

I'm especially perplexed by the assertion that virtualizing a monolithic Windows application, so that it can be hosted (on Windows servers) and accessed via a browser as a service, allows it to "... be part of an SOA-like strategy."

Sorry, dudes, that doesn't even come close to a SOA-enablement of that application. Now, being able to deconstruct said application into a bevy of independent services that can be easily mixed and matched with others (regardless of origin, runtime, framework or operational standards) gets a wee bit closer to the notion.

How about delivering those legacy Windows monolithic server stack applications as deconstructed loosely coupled services on virtualized Windows runtime instances on a hot-swappable Linux on multicore x86 blade farms? Yeee-ha!

It's also curious to see the rationale for Microsoft's sorta-SOA through the lens of needing to gain "... the ability to rapidly evolve an application because you need to change things in near-real time," said Steve Martin, director of product management for Microsoft's Connected Systems Division, as quoted in the article.

This may be RAD or Agile or even Lean -- but it really isn't quite SOA. What we're thinking on the SOA business value plane is the ability to "rapidly evolve" business processes, to actually not have to muck around with the underlying applications and data (nor necessarily upgrade their environments) but instead extract and extend their value into a higher abstraction where transactions, services, data, metadata, and logic can be agilely related and universally governed.

Not sure if BizTalk Server has gained that feature set yet. Visio? Well, obviously you have not yet upgraded sufficiently, dear reader.

It's plain from the article that Microsoft will define SOA to suit its strengths and sidestep its weaknesses. At least on that count Microsoft is adhering to the standard industry approach. But the risk for Microsoft, as it seeks a higher profile in the core datacenters of the global 2000, of not applying SOA at its higher business value is significant. Better stated, it may win some battles but lose the war.

Already, competitors such as IBM, Oracle and BEA are moving toward offerings that reduce the emphasis on applications (virtualized or not) and instead place the emphasis on business-process maneuverability. And just as with SaaS, it's not about the technology -- it's everything about the business paybacks (top-line and bottom-line) and overall IT risk reduction. The business models of both the vendor and the customer need to match up.

IBM's strategies around reuse, and its focus on creating pools of services that can be applied not generically to IT but specifically to business verticals, industries, and even individual enterprises (think of it as the long tail wagging SOA), are ultimately an aligning influence between IBM and its partners and customers. What runs beneath is more optional, so you might as well seek the best cost-benefit approach, which will include open source infrastructure, SaaS, and all the other options in the market.

If you save big on business agility, the cost differences of the underlying infrastructure (say, between IBM or BEA or Microsoft) are a rounding error -- but you have to deliver business agility.

IT needs to move holistically at the speed of businesses. Microsoft seems to be saying with sorta-SOA that, "Hey, it will cost you less to virtualize your Windows applications so use us, even though you gain less total agility." Does not compute, Will Robinson.

Microsoft's sorta-SOA approach may have short-term benefits for Redmond, but medium- to long-term it divides Microsoft from its customers and prospects and their needed goals for IT. I've said it before, what's good for Microsoft is not necessarily good for its customers, their balance sheets, or their business agility. And only Microsoft can change that.

An enterprise that embraces sorta-SOA from Microsoft alone will compete with an enterprise that embraces and extends SOA liberally and openly, leveraging all the options in the global market of labor, information, infrastructure, services, and flexible business and hosting models. As inclusive IT agility becomes the primary business enabler, globally and specifically, the choices will become fairly stark fairly quickly. They will be starker, and come sooner, for ISVs, service providers, and telcos.

At some point the savvy IT leaders may ask if dragging Windows Everywhere along for the ride makes sense. Why not just use sorta-SOA to expose all the existing Microsoft stuff, cut bait, and move on to the real differentiating IT functionality and productivity?

Will Microsoft's vision of Windows Everywhere "plus services" do better for their clients than enterprises that just move to services everywhere? This is the ultimate question.

What's most likely about Microsoft's current SOA strategy is that it keeps the company at the nuts-and-bolts level of good tools for making good applications and good services -- but at a price. Nowadays, however, making the services themselves does not hijack the decisions on infrastructure, as it did in the past.

In a loosely coupled, virtualized world -- where hosting and deployment options abound -- there is nothing that locks in the applications as services to any platform. This is true on both the client and server, and more true at the higher abstractions of metadata and business processes.

And, sure, there will be new choke points, such as governance, policy, and semantic symmetry -- but none of them seems to carry, for SOA, the added tax of necessary, routine, massive upgrades of specific essential components up and down a pre-configured stack. Sorta.

The playing field is a bit more level. May the best IT approaches to total business agility win.

Disclosure: I am a paid columnist for Redmond Developer News, a sister publication to Redmond Magazine.

Tuesday, July 31, 2007

Red Hat ramps up virtualization drive for RHEL 5

The good news for virtualization just keeps on coming. Monday, we reported that start-up Desktone had gotten an infusion of venture capital and that Cisco was buying a chunk of VMware.


Now, Red Hat's Emerging Technologies Team has a blog posting that shows how customers are using virtualization for fun and productivity in Red Hat's Enterprise Linux 5.

The team's blog pushes the idea of paravirtualization, a technology that offers high performance but doesn't require special processor capabilities. The team believes this will help drive the adoption of virtualization more pervasively, so that paravirtualization becomes the default deployment for RHEL 4 or 5 applications.

The blog includes a quick-hit bullet list of the main points and, for those with more time and interest, a longer discussion, as well as a case study.

Monday, July 30, 2007

SOA Insights analysts on Web 3.0, Google's role in semantics, and the future of UDDI

Listen to the entire podcast, or read a full transcript of the discussion.

The notion of a world wide web that anticipates a user's needs, and adds a more human touch to mere surfing and searching, has long been a desire and goal. Yet how close are we to a more "semantic" web? Will such improvements cross over into how enterprises manage semantic data and content?

Our expert panel digs into this and other recent trends in SOA and enterprise IT architecture in the latest BriefingsDirect SOA Insights Edition, volume 17. Our group also examines Adobe's open source moves around Flex, and how UDDI is becoming more about politics than policy.

So join noted IT industry analysts Joe McKendrick, Jim Kobielus, Dave Linthicum and Todd Biske for our latest SOA podcast discussion, hosted and moderated by yours truly.

Here are some excerpts:
I saw one recent article where [the semantic web] was called Web 3.0, and I thought, “Oh, my Lord, we haven’t even decided that we are all in agreement on the notion of Web 2.0.”

[But] there is activity at the World Wide Web Consortium that’s been going on for a few years now to define various underlying standards and specifications, things like OWL and SPARQL and the whole RDF and ontologies effort, and so forth.

So, what is the Semantic Web? Well, to a great degree, it refers to some super-magical metadata description and policy layer that can somehow enable universal interoperability on a machine-to-machine basis, etc. It more or less makes the meanings manifest throughout the Web through some self-description capability.

You can look at semantic interoperability as being the global oceanic concern. Wouldn’t it be great if every single application, database, or file ever posted by anybody anywhere on the Internet could somehow, magically, declare its full structure, behavior, and expectations?

Then you can look at semantic interoperability in a well-contained way as being specific to a particular application environment within an intranet or within a B2B environment. ... The whole notion of a "semantic Web," to the extent that we can all agree on a definition, won’t really come to the fore until there is substantial deployment inside of enterprises.

Conceivably, the enterprise information integration (EII) vendors are providing a core piece of infrastructure that could be used to realize this notion of a Semantic Web, a way of harmonizing and providing a logical unified view of heterogeneous data sources.

Red Hat, one of the leading open source players, is very geared to SOA and building an SOA suite. Now, they are acquiring an EII vendor, which itself is very SOA focused. So, you’ve got SOA; you’ve got open source; you’ve got this notion of a semantic layer, and so forth. To me, it’s like, you’ve stirred it all together in the broth here.

That sounds like the beginnings of a Semantic Web that conceivably could be universal or “universalizable,” because as I said, it’s open source first and foremost.

If we build on this, it does solve a lot of key problems. You end up dealing with universal semantics, how that relates to B2B domains, and how that relates to the enterprise domains.

As I'm deploying and building SOAs out there in my client base, semantic mediation ultimately is a key problem we’re looking to solve.

The average developer is still focused on the functionality of the business solution that they're providing. They know that they may have data in two different formats and they view it in a point-to-point fashion. They do what they have to do to make it work, and then go back to focusing on the functionality, not really seeing the broader semantic issues that come up when you take that approach.

One thing that’s going to happen with the influence of something like Google, which is exerting a ton of push in the business right now, is that ultimately these guys are exposing APIs as services. ... They're coming to the realization that the developers who leverage these APIs need to have a shared semantic understanding out on the Web. Once that starts to emerge, you're going to see a push down onto the enterprise, if that becomes the de-facto standard that Google is driving.

In fact, they may be in a unique position to create the first semantic clearing house for all these APIs and applications that are out there, and they are certainly willing to participate in that, as long as they can get the hits, and, therefore, get the advertising revenue that’s driving the model.

[Google] is in the API business and they are in the services business. When you're in for a penny, you're in for a pound. ... You start providing access to services, and rudimentary on-demand governance systems to account for the services and test for rogue services, and all those sorts of things. Then you ultimately get into semantics, security, and lots of other different areas they probably didn’t anticipate that they'd get into, but will be pushed into, based on the model they are moving into.

... Perhaps Google or others need to come into the market with a gateway appliance that would allow for policy, privilege, and governance. This would allow certain information from inside the organization that has been indexed in an appliance, say from Google, to then be accessed outside. Who is going to be in the best position to manage that gateway of content on a finely-grained basis? Google.
Listen to the entire podcast, or read the full transcript for more IT analysis and SOA insights. Produced as a courtesy of Interarbor Solutions: analysis, consulting and rich new-media content production.

Sunday, July 29, 2007

TIBCO donates Ajax messaging bus to OpenAjax Alliance

TIBCO Software is easing the way for Ajax component interoperability with the donation this week of its core Ajax message bus technology to the OpenAjax Alliance (OAA) Hub project. TIBCO announced the donation today, at the same time as it released its PageBus, a related open source product.

What's in it for you? Well, besides the technological benefits, developers could walk away with a 50-inch plasma TV or a 30-GB iPod, if they enter -- and win -- the Ultimate Mashup Ajax Challenge.

PageBus applies "publish and subscribe" message bus programming patterns within the context of a single Web page, allowing communication among multiple Ajax components. This allows developers to create composite applications from reusable parts and services. All of this is designed to reduce development costs, improve interfaces over HTML and increase business agility.

The message-bus approach solves one of the key problems that comes with building increasingly sophisticated composite applications. As the number of composite applications and mashups increases, the point-to-point programming required -- and the event-driven reliability it needs -- can increase exponentially.
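The pattern itself is compact. PageBus is JavaScript, so the following is only a language-neutral sketch in Python (with a made-up class name) of topic-based publish/subscribe: components that never reference each other directly exchange events through named topics on a shared bus, so adding a component means one subscription rather than wiring to every other component.

```python
from collections import defaultdict

# Minimal topic-based publish/subscribe bus -- a sketch of the pattern
# PageBus applies within a Web page. PageBus itself is JavaScript; this
# hypothetical Python class just illustrates the idea.

class PageBusSketch:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to be invoked on every publish to `topic`."""
        self._subs[topic].append(callback)

    def publish(self, topic, data):
        """Deliver `data` to every subscriber of `topic`."""
        for cb in self._subs[topic]:
            cb(data)

bus = PageBusSketch()
received = []

# Two independent "Ajax components" subscribe to the same topic...
bus.subscribe("quotes.update", lambda d: received.append(("chart", d)))
bus.subscribe("quotes.update", lambda d: received.append(("grid", d)))

# ...and a third publishes without knowing who is listening.
bus.publish("quotes.update", {"sym": "IBM", "px": 110.25})

assert received == [("chart", {"sym": "IBM", "px": 110.25}),
                    ("grid", {"sym": "IBM", "px": 110.25})]
```

With a bus, N components need N subscriptions instead of on the order of N-squared point-to-point links, which is exactly the exponential-growth problem described above.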

What's more, creating client-side SOA applications is easier because the same conceptual architecture -- publish and subscribe -- is used both for rich Internet application (RIA) client activities and for compositing backend services. TIBCO says it has large banks and other users delivering mission-critical, real-time data through SOA backends to scads of Ajax-enabled components on RIA clients.

Users get a quick, rich experience, while developers and architects gain flexibility and speed-to-deployment. TIBCO gains by riding the wave of increased demand for back-end SOA integration and messaging infrastructure to support the RIA ramp-up.

TIBCO, as a member of the OAA, is working with more than 70 companies to standardize key aspects of Ajax. OpenAjax Hub 1.0, the group's first specification, aims to provide Ajax interoperability through a publish/subscribe interface. The specification will formally be out in about six weeks, but the code is already at Sourceforge.net.

PageBus is open source and can be downloaded. It's also shipped as part of the TIBCO Ajax Message Service.

The above-noted mashup challenge is a developer community project to build the world's largest mashup using PageBus and TIBCO's General Interface. The contest runs through September 30, after which TIBCO and co-sponsor Artima will award prizes for the best entries.

'Desktop as a service' coming soon to a PC near you

Desktop virtualization as a service has gotten a big boost with Desktone Inc.'s announcement today of $17 million in venture capital funds that will allow the company to finance global sales, marketing, and development.

Desktone, a Chelmsford, Mass., startup, offers what it says is the industry's first unified virtual desktop platform. I call it "desktop as a service." The company's turnkey solution provides enterprises an on-ramp to virtual desktop computing that integrates all virtualization layers -- storage, applications, client devices, servers, processing, and network technologies -- allowing them to be controlled from a single management console.

The ability to deliver a PC operating environment that users are accustomed to, via grid/utility efficiencies, in a form that appeals to the realities of enterprise IT departments and needs, may be a seed with a long way to grow. But compelling economics and the general movement to services delivery portend a fast-growing new market segment for home, SMB and large business users. Telcos and cable providers will need to provide these kinds of services, for sure.

The first round of Desktone financing was co-led by Highland Capital Partners and SoftBank Capital. It also included Citrix Systems, Inc., a leader in application delivery infrastructure, and China-based Tangee International.

Desktone's management lineup boasts a collection of industry heavy hitters, including CEO Harry Ruda, formerly CEO of Softricity, which was acquired by Microsoft in 2006; COO Paul Gaffney, formerly CIO of Staples and a senior VP at Schwab Technology; CFO Michael Hentschel, who hails from Resilience, Avaterra, and TechVest Ventures; CTO Clint Battersby, who has over 15 years experience in LAN, WAN, Internet and storage acceleration; and executive VP Scott Andersen, former CEO of GlobalServe and Exist.

Ron Fisher of SoftBank Capital said his company's interest in Desktone's technology was spurred by the growing need to manage large-scale PC and server environments. "Enterprises have a tremendous need to accelerate desktop virtualization adoption in a way that is cost-effective and simple, and doesn't drain IT resources," Fisher said.

The Desktone announcement comes on the heels of the news that Cisco Systems plans to invest $150 million in virtualization giant VMware, Inc. That news was accompanied by the announcement of a collaboration agreement between the two companies around joint development, marketing, customer and industry initiatives.

The stars and planets finally appear to be aligning in a way that makes utility-oriented delivery of a full slate of client-side computing and resources an alternative worth serious consideration. As more organizations are set up as service bureaus -- due to such IT industry developments as ITIL and shared services -- the advent of off-the-wire everything seems more likely in many more places.

Of course, there are those who will maintain that "software plus services" is the best bet. The competition between more fully virtualized desktop approaches like Desktone and the heavy, PC-based OS legacy can only be good for the IT user market -- and for overall productivity.