Monday, May 18, 2009

TIBCO Spotfire 3.0's features bring strategists closer to real-time, data-driven decision-making

If the last six months have proven anything to business strategists, it's that corporate agility is not just a "nice to have." Being able to adjust massively complex businesses at the drop of a market index is clearly imperative.

But just how to act when the signs point to the need for rapid adjustment? The quality of insight -- not necessarily the quantity of data -- determines the winning response to unanticipated market and economic shifts.

So TIBCO Software's release today of Spotfire 3.0, the visualization analytics solution, comes at a great time. The platform's new features are designed to significantly improve integration of the structured data sets to be analyzed and viewed, improve how developers build analytics applications, and scale in volume and speed to the demands of global companies. [Disclosure: TIBCO is a sponsor of BriefingsDirect podcasts.]

Spotfire 3.0 lets users expand Spotfire applications into additional business areas, and also allows new classes of users to tap the Spotfire data visualization experience, says the company. The integration benefits include simplified connectivity to SAP, Oracle, Siebel, and Salesforce.com business applications data. Spotfire 3.0 works in tandem with TIBCO Spotfire Application Data Services to bring the data assets from these business applications into the visualization and distribution process.

The types of data views Spotfire produces augment, but don't replace, traditional business intelligence (BI) value. Furthermore, these easily customized data visualization applications can be used by many kinds of workers -- or via the web by customers and partners -- whereas BI usually requires seasoned analysts wielding SQL or other query tools as intermediaries. You'll need and want to be able to do both BI and ad hoc data visualization.

More and better data, put into a form that is easily and quickly accessed and understood, produces value that has never been more important. Quick and ubiquitous access to the fruits of data assimilation and analysis (with proper enterprise-class security and access control) not only helps companies and leaders make good decisions, it helps validate and adjust those decisions in near real-time. Nowadays, it's not enough to have a good bead on a strategy or shift; you need to have the convincing data available to prove and re-prove the actions and strategy. And then repeat.

The latest Spotfire release comes on the heels of last year's improvements in mashups support, real-time data and business process integration, new visualization methods and predictive analytics. These have helped companies leverage their investments in complex event processing (CEP) capabilities and enterprise service buses (ESBs). I wouldn't be surprised to see some ability to leverage the Spotfire analytics in the context of business process modeling (BPM) at some point in the future.

So far the visualization benefits of Spotfire apply to structured data, but a richer mix can be brought into the visualization landscape via third parties and various data and content assimilation methods. Bringing more content into the process will, of course, grow more important over time, especially as we enter the cloud era -- with valued data and information available from more sources in more formats.

Indeed, the newest Spotfire includes a Web services connector to tap many additional applications and data sources. "An integrated caching layer also dramatically speeds up data access from slow data sources by pre-loading common views and eliminating or reducing the need to create data warehouses or data marts," says TIBCO.
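TIBCO hasn't published the internals of that caching layer, but the pattern it describes is a familiar one: warm an in-memory cache with the views users request most often, and touch the slow back-end source only on a miss. Here is a minimal Java sketch of that idea; the class and method names are hypothetical, not Spotfire APIs.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Illustrative only, not Spotfire internals: a read-through cache that
// can be pre-loaded ("warmed") with commonly requested views so most
// requests never hit the slow back-end source.
public class ViewCache {

    private final Map<String, List<Object[]>> cache = new ConcurrentHashMap<>();
    private final Function<String, List<Object[]>> slowSource; // e.g., a remote query

    public ViewCache(Function<String, List<Object[]>> slowSource) {
        this.slowSource = slowSource;
    }

    // Warm the cache up front with the views users request most often.
    public void preload(List<String> commonViews) {
        for (String view : commonViews) {
            cache.put(view, slowSource.apply(view));
        }
    }

    // Serve from memory when possible; fall back to the slow source on a miss.
    public List<Object[]> getView(String viewName) {
        return cache.computeIfAbsent(viewName, slowSource);
    }
}
```

Pre-loading is what lets such a layer "eliminate or reduce" the need for a warehouse or mart: the common views already sit in memory, so the analytic tier never issues the expensive query at request time.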

TIBCO Spotfire 3.0 is available now. For more information, visit http://spotfire.tibco.com/Products/Whatsnew-Spotfire.aspx.

Wednesday, May 6, 2009

Compuware refocuses: optimization, performance, portfolio management in -- Quality out

This guest post comes courtesy of David A. Kelly at Upside Research, where he's principal analyst. You can reach him here.

Well, okay, maybe that headline is misleading, but the details aren’t.

Detroit-based software giant Compuware isn’t really dropping the quality of its products, but it is selling off its Quality Solutions product line to help refocus its business on areas where it can compete most effectively.

On Wednesday Compuware announced an agreement under which Micro Focus would acquire Compuware's Quality Solutions line, including the products themselves as well as the 330 people on the development, sales, and customer-support teams. The deal is valued at $80 million and is expected to close this quarter.

Micro Focus is also buying Borland Software for $67 million, placing Micro Focus more powerfully in the application quality and lifecycle management arena. [Disclosure: Borland is a sponsor of BriefingsDirect podcasts.]

Compuware has never been a company that moves fast — but for them, and their customers, that’s been a good thing. For years, Compuware has been a reliable, steady and practical IT partner for governments, mainframe-oriented IT shops, and large organizations.

But this announcement, which Compuware portrays as another step in its “Compuware 2.0 evolution,” is expected to allow Compuware to invest resources and energy in what it sees as high-opportunity markets, from application performance and mainframe optimization to IT portfolio management and healthcare collaboration.

Perhaps another way to read this is that while Obama's stimulus package has the potential to jack up the need for new technologies and the modernization of healthcare and other government IT environments, it doesn't necessarily mean that companies will be spending significantly more on code testing or development tools.

With Micro Focus acquiring Borland, the emphasis shifts deeply into application lifecycle management (ALM). Of course, more recently, Borland had spun off its traditional developer tools group into CodeGear (sold last year to Embarcadero Technologies), and had refocused on Open ALM, or ALM 2.0.

Incidentally, former Borland CEO Tod Nielsen is now a poobah at VMware.

Micro Focus hopes that by acquiring complementary technologies from Borland and Compuware it will be able to create a market-leading position in the application testing/automated software quality market. Such a position would also serve to broaden Micro Focus's leadership in the application management and modernization business.

And although this move makes some sense from Compuware’s perspective, don’t kid yourself that quality or good old testing is dead—it isn’t. And even though the next five years will no doubt see a big inflection point between traditional, workstation-oriented development products and processes and cloud-based ones, there are still plenty of applications and organizations that can benefit from solid application quality solutions.

Longer term, however, the real winner in that market will be the company (perhaps Micro Focus?) that's able to deliver forward-looking (i.e., cloud-oriented) technologies that span these IT needs and deliver practical solutions for increasing software and application quality.

This guest post comes courtesy of David A. Kelly at Upside Research, where he's principal analyst. You can reach him here.

Follow me on Twitter at http://twitter.com/Dana_Gardner.

WSO2 moves data services component to OSGi-based Carbon framework

Moving to expand its user base to more database folks, WSO2 is releasing the promised data services component to Carbon, the open source company’s new modular service-oriented architecture (SOA) framework based on the OSGi component model.

WSO2 Data Services is “completely re-architected” for Carbon’s componentized approach to SOA development, which WSO2 debuted earlier this year. [Disclosure: WSO2 is a sponsor of BriefingsDirect podcasts.]

The new data services tools are aimed at database programmers and database administrators (DBAs), folks who may not be as familiar with WS-* style Web services, REST-style Web resources, data services, or OSGi as their Java coding brethren.

To help ease database folks into the brave new world of data services, WSO2 is offering free online training courses this month to “explain data services concepts and best practices for quickly exposing data as Web services.” In order to promote new thinking about enterprise data applications in the midst of a recession, WSO2 said it is waiving the $199 fee for the courses.

“WSO2 Data Services addresses the demand among enterprises to quickly and easily take data from a wide variety of sources and expose it as Web services within their SOAs,” Dr. Sanjiva Weerawarana, founder and CEO of WSO2, said in announcing the product.

DBAs may be asking: “How easy is easy?”

WSO2 answers that anyone who knows SQL can quickly create data services that can be shared and accessed across the network.

And you can even do some data service management from -- we are not making this up -- your cell phone.

This feature comes courtesy of Data Services 2.0's new extensible server administration framework, which allows customizations such as writing a bridge application to manage data services servers from a BlackBerry or other mobile device.

Since almost no enterprise SOA application is going to have just a single database, the WSO2 product supports a range of data sources. It works with relational databases including Oracle, MySQL and IBM DB2, as well as “virtually any database accessible via JDBC.” It can also work with the good old comma-separated values (CSV) file format, and Excel spreadsheets.
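WSO2's actual mechanism is a declarative service descriptor rather than hand-written code, but the underlying idea -- wrap a SQL query in a web service operation -- can be sketched in plain Java. The sketch below uses JAX-WS (bundled with Java SE 7 and 8) and a hypothetical "employees" table; the connection details and class names are placeholders, not WSO2 APIs.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Conceptual sketch: expose one SQL query as a SOAP web service operation.
@WebService
public class EmployeeDataService {

    // Placeholder connection details -- substitute your own database.
    private static final String JDBC_URL = "jdbc:mysql://localhost:3306/hr";
    private static final String USER = "dba";
    private static final String PASSWORD = "secret";

    // The web service operation: run the query, return the rows.
    public List<String> getEmployeeNames() throws SQLException {
        List<String> names = new ArrayList<String>();
        try (Connection conn = DriverManager.getConnection(JDBC_URL, USER, PASSWORD);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT name FROM employees")) {
            while (rs.next()) {
                names.add(rs.getString("name"));
            }
        }
        return names;
    }

    public static void main(String[] args) {
        // Publish the service; clients discover it via the generated WSDL.
        Endpoint.publish("http://localhost:8080/employees", new EmployeeDataService());
        System.out.println("WSDL at http://localhost:8080/employees?wsdl");
    }
}
```

In the product itself, WSO2 says that boilerplate disappears: you supply the connection details and the SQL declaratively, and the server generates and hosts the service -- which is what makes the "anyone who knows SQL" claim plausible.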

For DBAs and others with security concerns about where all this disparate data is coming from and where it’s going, WSO2 says services can be authenticated, encrypted and/or signed using the WS-Security and HTTP security standards. There is also a WS-Policy Editor for configuring services, as well as support for WS-ReliableMessaging.

Event-driven architecture (EDA) aficionados will find that Data Services 2.0 supports events, including graphical declaration of event sources and mediation for event delivery.

Rich Seeley provided research and editorial assistance to BriefingsDirect on this blog. He can be reached at richseeley@aol.com.

Follow me on Twitter at http://twitter.com/Dana_Gardner.

Active Endpoints' new ActiveVOS 6.2 offers ‘MultiSite’ BPM capabilities

Run business process management (BPM) applications in data centers anywhere on the planet, scale up, scale down as your business needs change, and never worry about losing a process if a server or an entire location goes down.

This is the market Active Endpoints is aiming at with ActiveVOS 6.2, a new release of its visual orchestration systems (VOS) tools. [Disclosure: Active Endpoints is a charter sponsor of BriefingsDirect podcasts.]

The business process management suite (BPMS) featuring ActiveVOS MultiSite allows users to extend the processing of BPM applications across multiple, geographically separated data centers, according to the Active Endpoints announcement.

At first this might seem like ActiveVOS is trying to ride the cloud hype cycle. But Alex Neihaus, vice president of marketing for Active Endpoints, is skeptical of the cloud and even Platform as a Service (PaaS), especially when it comes to BPM.

He notes that similar concepts in the past have had their share of failures, such as network storage, as well as successes, such as Salesforce.com. And he has doubts whether enterprises “will outsource the core business processes inherent in BPM applications.”

IBM will be testing that hypothesis with some new offerings on the modeling side, though it's clear IBM likes the idea of process management -- and even governance -- having a place in the cloud, on-premises, and probably both at once.

ActiveVOS MultiSite is designed to protect “crucial, long-running business processes from interruption or termination due to a major hardware or site failure,” but Neihaus believes its users will want to know exactly where their applications are running, even if the data center is on the other side of the globe.

"With our new release, customers can create a reliable environment to run these core apps," Neihaus told BriefingsDirect. "I wouldn’t link the ActiveVOS 6.2 failover and load-balancing capabilities to PaaS as much as I would to the fact that it’s the first BPMS to deliver geographic independence for BPM applications."

If you follow this market and think this release sounds like “deja vu all over again,” as Yogi used to say, you are partially right.

This week’s ActiveVOS 6.2 comes just 60 days after ActiveVOS 6.1, which was announced in mid-March, Neihaus acknowledges. But he positions this quick succession of point releases as an example of the speed of innovation at Active Endpoints.

Users can download a free, 30-day, fully-supported trial of ActiveVOS 6.2. During the trial, users can take advantage of email-based support as well as training, education and samples available on Active Endpoints’ websites.

Rich Seeley provided research and editorial assistance to BriefingsDirect on this blog. He can be reached at richseeley@aol.com.

Follow me on Twitter at http://twitter.com/Dana_Gardner.

Monday, May 4, 2009

IBM cements cloud, appliance, BPM, CEP and SOA into an IMPACT 2009 solution brick

LAS VEGAS -- Wasting no time in bringing a needed cohesion across its products and solutions, IBM on Monday at its IMPACT 2009 event here unveiled a cloud-based business process modeling (BPM) service, tighter alignment with Amazon, and better complex event processing (CEP) integration, while re-introducing a WebSphere private cloud appliance and doubling down on a slew of its industry framework solutions.

Under the umbrella of spurring on a smarter planet, the IBM push combines many of Big Blue's strengths with the goal of taking out complexity and cutting costs as its customers seek much greater business efficiency in a recession-wracked world. The moves also further IBM's embrace of and commitment to service-oriented architecture (SOA), and spread its benefits both deeper and wider than the earlier infrastructure push alone.

In a nutshell, IBM is helping enterprises create private clouds, either as appliances or built on Z Series mainframes, with better connections to CEP, and managed from BPM in a public cloud. It makes an excellent story for IBM, and places it at an early competitive advantage against Microsoft, Oracle/Sun, and HP in the ramp-up to SOA-enabled hybrid cloud approaches that tackle tough business problems. IBM is going to the cloud with collaboration, too.

The pizza box-size WebSphere CloudBurst appliance, announced only recently, had its coming-out party at today's keynote session, moderated by a hilarious Billy Crystal. Search Twitter for the #IBMIMPACT tag for more on the live event.

This appliance approach to private clouds will be a big trend in the industry, with Oracle (using acquired Sun technology), HP and perhaps Cisco sure to follow. One has to wonder how Microsoft will do appliances -- with one or several partners? I'll be curious to watch. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

To me, though, the biggest news of IMPACT is IBM's move to provide its own BPM cloud services, called BPM BlueWorks, beginning in Q2 this year. IBM continues to be chummy with Amazon Web Services, and there's no reason to believe that Google won't also become an IBM cloud partner.

Indeed, the shift of BPM to a separate, elevated, cloud-based service makes sense because many services and processes will increasingly come from a variety of sources and source types. Allowing business process and workflow architects to design, manage and implement extended business processes as a cloud service lets more businesses leverage more services, while retaining control and the ability to cut costs and reduce complexity.

What's more, if BPM goes into the cloud, it takes only a small step for IT and SOA governance to move to the cloud, too. Will SOA, CEP and extended-enterprise business processes come together better as a cloud-based management and governance model takes hold? Could be.

The only rub is that IBM or some other cloud provider becomes host to your core control centers. But if enterprises grow comfortable with more IT functions and assets in a third-party cloud, well then the model may well offer a lot of advantages. Of course, the BPM, SOA and governance controls will also likely become hybrids.

IBM and others, like Microsoft, Oracle and HP, will also want to be in the managed management business, so the competition to do this well and right will be intense. And that will be good for users and probably (hopefully) keep the options, standards and portability largely open.

But users should still look out, as with any cloud services, for lock-in and seek contracts that protect their assets and business property. And I'd say that the governance and process models that dictate how your business works should always be considered an enterprise's property. The cloud provider needs to be a value-added provider, not a Big Brother.

IBM is also pumping up its industry framework solutions of applications and expertise for retail, traffic management, and health care. Look for these, too, to emerge as cloud-based hybrid solutions over time. The goal, of course, is to make IBM the total supplier for these vertical industry solutions, with cost and convenience being the drivers of how they are implemented. IBM has done quite well by this so far, and the cloud moves will help it further.

IBM in the cloud is, in a lot of ways, a very smart move. Getting BPM there first -- in the middle of processes, solutions, and the move toward governance -- will be hard for users to resist and tough for competitors to beat.

Follow me on Twitter at http://twitter.com/Dana_Gardner.