Monday, May 4, 2009

IBM cements cloud, appliance, BPM, CEP and SOA into an IMPACT 2009 solution brick

LAS VEGAS -- Wasting no time in bringing a needed cohesion across its products and solutions, IBM on Monday at its IMPACT 2009 event here unveiled a cloud-based business process modeling (BPM) service, announced tighter alignment with Amazon, improved complex event processing (CEP) integration, re-introduced a WebSphere private cloud appliance and doubled down on a slew of its industry framework solutions.

Under the umbrella of spurring on a smarter planet, the IBM push combines many of Big Blue's strengths with the goal of taking out complexity and cutting costs as its customers seek much greater business efficiency in a recession-wracked world. The moves also further IBM's embrace of and commitment to service-oriented architecture (SOA), but spread its benefits both deeper and wider than the earlier infrastructure push alone.

In a nutshell, IBM is helping enterprises create private clouds, either as appliances or built on Z Series mainframes, with better connections to CEP, all managed from BPM in a public cloud. It provides an excellent story for IBM, and places it at an early competitive advantage against Microsoft, Oracle/Sun, and HP in the ramp-up to SOA-enabled hybrid cloud approaches that tackle tough business problems. IBM is going to the cloud with collaboration, too.

The pizza box-sized WebSphere CloudBurst appliance, announced only recently, had its coming-out party at today's keynote session, moderated by a hilarious Billy Crystal. Search the #IBMIMPACT tag on Twitter for more on the live event.

This appliance approach to private clouds will be a big trend in the industry, with Oracle (using acquired Sun technology), HP and perhaps Cisco sure to follow. One has to wonder how Microsoft will do appliances. With one or more partners? It will be interesting to watch. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

To me, though, the biggest news of IMPACT is IBM's move to provide its own BPM cloud services, called BPM BlueWorks, beginning in Q2 this year. IBM continues to be chummy with Amazon Web Services, and there's no reason to believe that Google won't also become an IBM cloud partner.

Indeed, the shift of BPM to a separate, elevated, cloud-based service makes sense because many services and processes will increasingly come from a variety of sources and source types. Allowing business process and workflow architects to design, manage and implement extended business processes as a cloud service lets more businesses leverage more services, while retaining control and the ability to cut costs and reduce complexity.

What's more, if BPM goes into the cloud, then it takes only a small step for IT and SOA governance to move into the cloud, too. Will SOA, CEP and extended-enterprise business processes come together better as a cloud-based management and governance model takes hold? Could be.

The only rub is that IBM or some other cloud provider is host to your core control centers. But if enterprises grow comfortable with more IT functions and assets in a third-party cloud, well then the model may well offer a lot of advantages. Of course, the BPM, SOA and governance controls will also likely become hybrids.

IBM and others, like Microsoft, Oracle and HP, will also want to be in the managed management business, so the competition to do this well and right will be intense. And that will be good for users and probably (hopefully) keep the options, standards and portability largely open.

But users should still look out, as with any cloud services, for lock-in and seek contracts that protect their assets and business property. And I'd say that the governance and process models that dictate how your business works should always be considered an enterprise's property. The cloud provider needs to be a value-added provider, not a Big Brother.

IBM is also pumping up its industry frameworks solutions of applications and expertise for retail, traffic management, and health care. Look for these, too, to emerge as cloud-based hybrid solutions over time. The goal, of course, is to make IBM the total supplier of these vertical industry solutions, with cost and convenience being the drivers of how they are implemented. IBM has done quite well by this so far, and the cloud moves will help it further.

IBM's move to the cloud is in a lot of ways a very smart one. Getting BPM there first -- in the middle of processes, solutions, and the move to governance -- will be hard for users to resist and tough for competitors to beat.

Follow me on Twitter at http://twitter.com/Dana_Gardner.

Sunday, May 3, 2009

BriefingsDirect analysts unpack PaaS and predict future impact on enterprises and developers

Listen to the podcast. Download the podcast. Find it on iTunes and Podcast.com. Charter Sponsor: Active Endpoints. Sponsor: TIBCO Software.

Read a full transcript of the discussion.

Special offer: Download a free, supported 30-day trial of Active Endpoints' ActiveVOS at www.activevos.com/insight.

People talk about “The Cloud” as if it is one unified platform, but it’s the exact opposite, argues Jim Kobielus, senior analyst at Forrester Research.

Misconceptions about what “The Cloud” is and a general lack of standardization among vendors offering cloud computing services are bound to confuse organizations contemplating moving enterprise applications out onto the Internet.

In a podcast featuring a panel of industry thought leaders, moderated by yours truly, we offer new insight into the current status of cloud offerings and the future need for open standards and governance. Who is using the cloud for what -- and where this trend is going -- are discussed as the podcast panelists unpack the Platform as a Service (PaaS) concept in BriefingsDirect Analyst Insights Edition, Volume 40.

But before everyone jumps on the cloud bandwagon, they need to know what they are getting into, and Kobielus warns of dangers ahead if the cloud vendors end up returning to the era of silos. With each vendor creating its own proprietary version, the cloud could transform service-oriented architecture (SOA) into “silo or stealth pipe” architecture.

“The current state of cloud computing goes against the grain of SOA, where SOA is all about platform agnosticity and being able to port services flexibly and transparently from one operating platform to another,” Kobielus argues.

"This a real challenge for Microsoft. It's like the open systems discussion we had a little while ago," says David A. Kelly, president of Upside Research. "It makes more sense for players that actually earn their revenue in a different form than traditional operators, because someone like Amazon has a core business.

"Someone like Microsoft is kind of painted into the corner at the moment. That's a challenge not just for Microsoft, but for other traditional vendors. They can expand into this new area by offering low-cost services that take away from competitors, but don't hurt their core business," said Kelly.

But the cloud may simply not yet be at the stage of maturity where vendors can all get together and sing "Kumbaya." Jonathan Bryce, co-founder of Mosso, the cloud services division at Rackspace, says vendors and providers are still getting their acts together.

“We are still developing what our niche is going to be,” he explained. “So there hasn't been a lot of time to kind of stick our heads up and say, 'Oh, okay, this is what they are doing, and this is what we are doing, and it makes sense for us to tie these together'.”

That doesn’t mean everyone is afraid to try out cloud computing.

Developers, always a curious and adventuresome bunch, are already flying off into the clouds that provide them with easy access to compute power in the PaaS mode.

The cloud can set coders free, says Rourke McNamara, product marketing director at TIBCO Software, in making the positive case for PaaS. [Disclosure: TIBCO is a sponsor of BriefingsDirect podcasts.]

“It frees them from having to worry about a bunch of details that have nothing to do with their core business and the application they are writing,” McNamara explains. “It frees them from having to install platform software on a bunch of machines, putting those machines into racks, connecting them up to the management and monitoring infrastructure, getting everything set up so that those machines are fault-tolerant and the load is distributed appropriately, making sure that they have the right machines to handle load, and predicting load increases and capacity or requirement increases far enough in advance that they are able to buy new machines.”
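To make that concrete, here is a minimal, purely hypothetical sketch (no real PaaS product or API is implied; FakePaasClient and its parameters are invented for illustration) of how small the developer's operational surface can become once the platform owns the machines:

```java
// Hypothetical sketch only -- no real PaaS product or API is implied.
// It illustrates the concerns McNamara says PaaS removes: the developer
// declares capacity and fault-tolerance goals; the platform handles machines.
import java.util.HashMap;
import java.util.Map;

public class PaasDeploySketch {

    /** Stand-in for a PaaS control API; a real platform would expose something comparable. */
    static class FakePaasClient {
        private final Map<String, String> deployments = new HashMap<String, String>();

        void deploy(String app, String artifact, int minInstances, int maxInstances, int scaleAtCpuPercent) {
            // A real platform would now handle placement, racking, monitoring,
            // load balancing, and adding instances when CPU crosses the threshold.
            deployments.put(app, artifact);
            System.out.printf("Deployed %s (%s): %d-%d instances, scale out above %d%% CPU%n",
                    app, artifact, minInstances, maxInstances, scaleAtCpuPercent);
        }
    }

    public static void main(String[] args) {
        FakePaasClient cloud = new FakePaasClient();

        // Under PaaS, this declaration is roughly the developer's whole capacity plan.
        cloud.deploy("order-service", "order-service-1.0.war",
                2,    // never fewer than two copies, for fault tolerance
                20,   // headroom for load spikes
                70);  // add instances when sustained CPU exceeds 70 percent
    }
}
```

Everything a real platform would do behind that single call (provisioning, monitoring, load balancing, scaling) is precisely the list McNamara rattles off.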

Beyond just being a developers’ playground, the panelists see the cloud as eventually hosting Web-based business intelligence (BI), data marts, data mining and outward-facing B2B and B2C applications.

“You don't make your money by selling your own bellybutton,” quips Michael Meehan, senior analyst with Current Analysis. “You make your money by going out and interacting with the rest of the world, and so those are where the opportunities are.”

Meehan suggests that the governance or “adult supervision” needed for the cloud can build on what has already been done with SOA. He sees the cloud as the extension of service-orientation. He argues that not only is SOA not dead, but that the past decade of work on industry standards for services will allow organizations to take advantage of PaaS for business applications.

“I don't think you can move out to the cloud unless you are essentially service-oriented,” Meehan said. “I don't think the one exists without the other.”

So join our guests and analysts as they dig into the enterprise role of PaaS.

Listen to the podcast. Download the podcast. Find it on iTunes and Podcast.com. Charter Sponsor: Active Endpoints. Sponsor: TIBCO Software.

Read a full transcript of the discussion.

Special offer: Download a free, supported 30-day trial of Active Endpoints' ActiveVOS at www.activevos.com/insight.

Friday, May 1, 2009

New Open Group SOA book builds bridge over delta between IT and business services

Business people discovered services around the time the first cave man offered to start campfires in exchange for food.

IT people grokked service orientation sometime in the past decade but are still struggling to communicate their discovery to business people.

This may be an exaggeration, but it also helps explain the disconnect between business and IT that has plagued adoption of service-oriented architecture (SOA) to the point where some people have thrown up their hands and declared SOA dead.

In some ways the problem of getting business people to embrace SOA is due to this backward-incompatible approach, in the view of Chris Harding, forum director for The Open Group. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

“One of the things we are supposed to do is bring about alignment between the business and technical communities,” he said. “What we found when we were doing that is the business people have known what a service was for centuries if not millennia.

"And technical people have come across this wonderful new idea. And actually the alignment problem is to stop the technical people reinventing service in a new way that the business people don’t understand.”

To help get the business-technical alignment back on track, The Open Group is publishing The SOA Source Book.

Harding knows what you are thinking: What do we need with another SOA book?

He is quick to differentiate The SOA Source Book from all the other SOA titles now available.

To begin with, this is not your coder’s SOA book. It does not tell you how to build a service. It is also not a publication of standards and guidelines that would have required a lengthy review and adoption process.

The SOA Source Book was created by members of The Open Group’s SOA Workgroup, who have day jobs architecting business applications. They are offering real world enterprise architecture experience in deploying services for business purposes.

Applying The Open Group Architecture Framework (TOGAF) approach, The SOA Source Book “aspires to be systematized common sense” in architecture and governance, Harding says.

It takes a flexible approach to implementation. For example, rather than advocating one model for SOA, the book suggests a number of models that can be used depending on what makes the most common sense for the business application.

The SOA Source Book is also not your after-market weighty tome that can double as a doorstop. Running exactly 100 pages in the PDF version, it features short, clear sentences in brief paragraphs focused on the many moving parts of an SOA implementation. Scanning the categories and subheads in the table of contents, the reader can quickly find information on a specific subject.

Rather than reading it from cover to cover, Harding anticipates that enterprise and IT architects will use it to quickly look up information they need for specific components or processes they are working on.

The SOA Source Book is available in both printed and electronic form (if you want to save a tree). More information is available.

Rich Seeley provided research and editorial assistance to BriefingsDirect on this blog. He can be reached at Writer4Hire.

Follow me on Twitter at http://twitter.com/Dana_Gardner.

rPath offers free management tool for applications aspiring to the cloud

rPath would like to be your applications' path to cloud computing.

The Raleigh, N.C.-based startup founded by Red Hat refugees Tim Buckley, executive chairman of the board, and Erik Troan, CTO, recently released a free downloadable version of rBuilder for managing application deployment to virtual or cloud-based environments as well as traditional glass houses. [Disclosure: rPath is a sponsor of BriefingsDirect podcasts.]

For IT managers looking at cloud deployment, rPath’s approach is to embrace as many flavors of the cloud as possible to deal with the fact that what is commonly called the cloud is really a bunch of non-standard environments varying from vendor to vendor.

rPath lists support for three clouds: Amazon EC2, Globus Alliance, and Bluelock. rBuilder also supports hypervisors, including VMware ESX, Citrix Xen and Microsoft Hyper-V.

As a startup with a limited budget for hardware, rPath eats its own cloud dog food. The company uses Amazon EC2 for some of its own applications, as Billy Marshall, chief strategy officer, explained in a Q&A interview with SearchSOA last fall. We also did an interview with Marshall on BriefingsDirect.

The new free version of rBuilder differs from the free rBuilder Online community version in that you can download it and run it behind your own firewall. And it differs from the commercial version in that it is restricted to 20 running system instances in production.

Once a user reaches 21, they have to “establish a commercial relationship with rPath.”

Also, users of the free version can only get support through the rBuilder Online community.

For shops looking to explore cloud computing, the free version of rBuilder appears to be a viable option. You can check out the system requirements and download instructions at rPathQuickStart.

Rich Seeley provided research and editorial assistance to BriefingsDirect on this blog. He can be reached at Writer4Hire.

Follow me on Twitter at http://twitter.com/Dana_Gardner.

Wednesday, April 29, 2009

PC 'security as a service' gains global cloud footprint with free Panda anti-virus offering

Cloud computing's utility and power in everyday life reached a notable new milestone today with Panda Security's free PC security service.

This delivery and two-way malware detection-access model makes a ton of sense, so much so that I expect we'll soon be seeing the cloud model deliver more than PC security and anti-virus/anti-spam services. The era of remote services for a slew of device support and maintenance -- of everything from cars to cell phones to home appliances -- is upon us.

Essentially anything that uses software and has network access can be supported efficiently and powerfully based on the Panda Security cloud model. Making the service free to home-based users is especially brilliant because it gains the Metcalfe's Law benefits of a valuable community to detect the malware, along with the means to then sell the detection and prevention capabilities to business and professional users. [Disclosure: Panda Security is a sponsor of BriefingsDirect podcasts.]

Here's how it works, from Panda's release:
Consumers can download the free client protection product from http://www.cloudantivirus.com. ... The Panda Cloud Antivirus thin-client agent introduces a new philosophy for on-access asynchronous cloud-scanning. It combines local detection technologies with real-time cloud-scanning to maximize protection while minimizing resource consumption. This optimized model blocks malicious programs as they attempt to execute, while managing less dangerous operations via non-intrusive background scans.

Based on Panda's proprietary cloud computing technology, called Collective Intelligence, Panda Cloud Antivirus harnesses the knowledge of Panda's global community of millions of users to automatically identify and classify new malware strains in almost real-time. Each new file received by Collective Intelligence is automatically classified in under six minutes. Collective Intelligence servers automatically receive and classify over 50,000 new samples every day. In addition, Panda's Collective Intelligence system correlates malware information data collected from each PC to continually improve protection for the community of users.
Panda says the model demands a lot less of a PC's resources, 5% versus 9% for other fat-client AV software approaches. That means older PCs can get protected better, cheaper, and longer. Far fewer people will need to upgrade the PC hardware just to keep it free from viruses. It's about time! Poor security should not be a business model for sellers of new computers and software.
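Here's a generic, hypothetical sketch of that two-tier model (it is not Panda's actual protocol or code, and the "cloud" side is simulated with an in-memory map): the thin client hashes a file, checks a small local cache of verdicts first, asks the cloud only on a miss, blocks anything known to be malicious, and defers unknowns to a background scan.

```java
// Generic, hypothetical sketch of on-access cloud-assisted scanning --
// not Panda's actual protocol, file formats, or code.
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.HashMap;
import java.util.Map;

public class CloudScanSketch {

    enum Verdict { CLEAN, MALICIOUS, UNKNOWN }

    /** Small local signature cache: fast, works offline, covers recent verdicts. */
    static final Map<String, Verdict> LOCAL_CACHE = new HashMap<String, Verdict>();

    /** Stand-in for the cloud "collective intelligence" service (simulated with a map). */
    static final Map<String, Verdict> CLOUD_SERVICE = new HashMap<String, Verdict>();

    static String sha256(byte[] content) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        StringBuilder hex = new StringBuilder();
        for (byte b : md.digest(content)) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    /** Called on access: block execution only when something is known to be malicious. */
    static boolean allowExecution(String fileName, byte[] content) throws Exception {
        String hash = sha256(content);

        Verdict verdict = LOCAL_CACHE.get(hash);            // 1. cheap local lookup first
        if (verdict == null) {
            verdict = CLOUD_SERVICE.get(hash);              // 2. ask the cloud on a miss
            if (verdict == null) {
                verdict = Verdict.UNKNOWN;                  // 3. unknown: allow for now,
                System.out.println(fileName + ": unknown, queued for background scan");
            }
            LOCAL_CACHE.put(hash, verdict);                 //    remember the answer locally
        }

        boolean allowed = verdict != Verdict.MALICIOUS;
        System.out.println(fileName + ": " + verdict + (allowed ? " -> run" : " -> blocked"));
        return allowed;
    }

    public static void main(String[] args) throws Exception {
        // Seed the simulated cloud with one known-bad and one known-good sample.
        CLOUD_SERVICE.put(sha256("evil payload".getBytes(StandardCharsets.UTF_8)), Verdict.MALICIOUS);
        CLOUD_SERVICE.put(sha256("harmless tool".getBytes(StandardCharsets.UTF_8)), Verdict.CLEAN);

        allowExecution("dropper.exe", "evil payload".getBytes(StandardCharsets.UTF_8));
        allowExecution("utility.exe", "harmless tool".getBytes(StandardCharsets.UTF_8));
        allowExecution("newfile.exe", "never seen before".getBytes(StandardCharsets.UTF_8));
    }
}
```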

I'm going to try this service on Windows XP Home running under Parallels on my Leopard iMac. I'll report back on how it works.

As I said, I hope this model succeeds because it really is a harbinger of how cloud-based services can improve and solve thorny problems in a highly efficient manner that combines the power of community with scale and automation. This may go far in also dissuading the creators of malware, because the bad things will be squelched so fast, if a Panda model gets critical mass, that the effort is useless and therefore moot.

Panda Security, a privately held company based in Spain, could well see its services expand to include PC maintenance, remote and automated support, and even more SaaS applications and productivity services. I expect this burgeoning ball of PC services from the cloud ecology to become the real software plus services model. It will be very interesting to see which vendors and/or providers or partnerships can assemble the best solutions package first and best.

Incidentally, don't expect Microsoft to do this cloud-based security thing. It can't afford to kill off or alienate the third-party malware security providers by doing it all itself. Those days are long gone. The third parties, however, can now stretch their wings and fly. And they are.

Follow me on Twitter at http://twitter.com/Dana_Gardner.

Tuesday, April 28, 2009

Can software development aspire to the cloud?

This guest post comes courtesy of Tony Baer's OnStrategies blog. Tony is a senior analyst at Ovum. His profile is here. You can reach him here.

As we’re all too aware, the tech field has always been all too susceptible to the fad of the buzzword, which of course gave birth to another buzzword, popularized by Gartner’s Hype Cycles. But in essence the tech field is no different from the worlds of fashion or the latest wave in electronic gizmos – there’s always going to be some new gimmick on the block.

But when it comes to cloud, we’re just as guilty as the next would-be prognosticator, as it figured into several of our top predictions for 2009. In a year of batten-down-the-hatches psychology, anything that saves or postpones costs and avoids long-term commitment, while preserving all options (to scale up or ramp down), is going to be quite popular, and under certain scenarios, cloud services support all that.

And so it shouldn’t be surprising that roughly a decade after Salesforce.com re-popularized the concept (remember, today’s cloud is yesterday’s time-sharing), the cloud is beginning to shake up how software developers approach application development. But in studying the extent to which the cloud has impacted software development for our day job at Ovum, we came across some interesting findings that in some cases had their share of surprises.

ALM vendors, like their counterparts on the applications side, are still figuring out how the cloud will impact their business. While there is no shortage of hosted tools addressing different tasks in the software development lifecycle (SDLC), major players such as IBM/Rational have yet to show their cards. In fact, there was a huge gulf of difference in cloud-readiness between IBM and HP, whose former Mercury unit has been offering hosted performance testing capabilities for 7 or 8 years, and is steadily expanding hosted offerings to much of the rest of its BTO software portfolio.

More surprising was the difficulty of defining what Platform-as-a-Service (PaaS) actually means. There is the popular definition and then the purist one. For instance, cloud service providers such as Salesforce.com employ the term PaaS liberally in promoting their Force.com development platform, but in actuality development for the Force.com platform uses coding tools that don’t run on Salesforce’s servers, but locally on the developer’s own machines. Only once the code is compiled is it migrated to the developer’s Force.com sandbox, where it is tested and staged prior to deployment. For now, the same principle applies to Microsoft Azure.

That throws plenty of ambiguity on the term PaaS – does it refer to development inside the cloud, or development of apps that run in the cloud? The distinction is important, not only to resolve marketplace confusion and realistically manage developer expectations, but also to highlight the reality that apps designed to run inside a SaaS provider’s cloud are going to be architecturally different from those deployed locally. Using the Salesforce definition of PaaS, apps that run in its cloud are designed around the fact that the Salesforce engine handles all the underlying plumbing. In this case, it also highlights the very design of Salesforce’s Apex programming language, which is essentially a stored-procedures variant of Java. It’s a style of development popular from the early days of client/server, where the design pattern of embedding logic inside the database was viewed as a realistic workaround to the bottlenecks of code running from fat clients. Significantly, it runs against common design patterns for highly distributed applications, and of course against the principles of SOA, which are to loosely couple the logic and abstract it from the physical implementation. In plain English, this means that developers of apps to run in the cloud may have to make some very stark architectural choices.
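To make that architectural fork concrete, here is a deliberately simplified, hypothetical Java comparison (it is not Apex, Force.com, or any vendor's actual code): the first class welds pricing logic to one platform's data plumbing, stored-procedure style, while the second puts the same logic behind a neutral service contract that any host, cloud or on-premises, could implement.

```java
// Hypothetical sketch of the architectural choice described above;
// it is not Apex, Force.com, or any vendor's actual code.
import java.util.HashMap;
import java.util.Map;

public class CouplingSketch {

    // Tightly coupled style: logic lives next to one platform's storage,
    // much like a stored procedure. Porting it means rewriting it.
    static class PlatformBoundDiscount {
        private final Map<String, Double> platformStore = new HashMap<String, Double>();

        PlatformBoundDiscount() {
            platformStore.put("ACME", 0.10); // assumes this platform's own data objects
        }

        double priceFor(String customer, double listPrice) {
            Double discount = platformStore.get(customer); // direct reach into the platform's store
            return listPrice * (1.0 - (discount == null ? 0.0 : discount));
        }
    }

    // Loosely coupled, SOA-style: the logic depends only on an abstract
    // service contract, so the implementation can move between hosts.
    interface DiscountService {
        double discountFor(String customer);
    }

    static class PricingLogic {
        private final DiscountService discounts;

        PricingLogic(DiscountService discounts) {
            this.discounts = discounts;
        }

        double priceFor(String customer, double listPrice) {
            return listPrice * (1.0 - discounts.discountFor(customer));
        }
    }

    public static void main(String[] args) {
        System.out.println("Platform-bound: " + new PlatformBoundDiscount().priceFor("ACME", 100.0));

        // The same logic behind a contract: swap in a cloud-hosted or on-premises
        // implementation without touching PricingLogic.
        DiscountService inMemory = new DiscountService() {
            public double discountFor(String customer) {
                return "ACME".equals(customer) ? 0.10 : 0.0;
            }
        };
        System.out.println("Service-oriented: " + new PricingLogic(inMemory).priceFor("ACME", 100.0));
    }
}
```

The discount math is trivial on purpose; the stark choice is where the logic is allowed to live.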

The confusion over PaaS could be viewed as a battle over vendor lock-in. It would be difficult to port an application running in the Salesforce cloud to another cloud provider or transition it to on premises, because the logic is tightly coupled to Salesforce’s plumbing. This also sets the stage for future differentiation of players like Microsoft, whose Software + Services is supposed to make the transition between cloud and on premises seamless; in actuality, that will prove more difficult unless the applications are written in a strict, loosely coupled, service-oriented manner. But that’s another discussion that applies to all cloud software, not just ALM tools.

But the flipside of this issue is that there are very good reasons why much of what passes for PaaS involves on-premises development. And that in turn provides keen insights as to which SDLC tasks work best in the cloud and which do not.

The main don’ts consist of anything having to do with source code, for two reasons: network latency and IP protection. The first one is obvious: who wants to write a line of code and wait for it to register with the system, only to find out that the server or network connection went down and you’d better retype your code again? Imagine how aggravating that would be with highly complex logic; obviously no developer, sane or otherwise, would have such patience. And ditto for code check-in/check-out, or for running the usual array of static checks and debugs. Developers have enough things to worry about without having to wait for the network to respond.

Of more concern, however, is the issue of IP protection: while your program is in source code and not yet compiled or obfuscated, anybody can get to it. The code is naked; it’s in a form that any determined hacker can read. Now consider that unless you’re automating a lowly task like queuing up a line of messages or printers, your source code is business logic that represents in software how your company does business. Would any developer wishing to remain on the payroll the following week dare place code in an online repository that, no matter how rigorous the access control, could be blown away by determined hackers for whatever nefarious purpose?

If you keep your logic innocuous or sufficiently generic (such as using hosted services like Zoho or Bungee Connect), developing online may be fine (we’ll probably get hate mail on that). Otherwise, it shouldn’t be surprising that no ALM vendor has yet placed, or is likely to place, code-heavy IDEs or source code control systems online. OK, Mozilla has opened the Bespin project, but just because you could write code online doesn’t mean you should.

Conversely, anything that is resource-intensive, like performance testing, does well with the cloud because, unless you’re a software vendor, you don’t produce major software releases constantly. You need lots of resources occasionally to load and performance test those apps (whose code is compiled by that point anyway). That’s a great use of the cloud, as HP’s Mercury has been doing since around 2001.
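A toy example shows why the workload fits rented capacity (generic, hypothetical code, not HP's or anyone's actual test tooling): a load test is a short, intense burst of concurrent requests, the kind of job that wants many machines for an afternoon and none for the rest of the quarter.

```java
// Generic, hypothetical load-generator sketch -- not HP/Mercury's or any
// vendor's actual tooling. It shows why performance testing is bursty:
// many concurrent workers for a short time, then nothing.
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class LoadBurstSketch {

    static long timeOneRequest(String target) throws Exception {
        long start = System.nanoTime();
        HttpURLConnection conn = (HttpURLConnection) new URL(target).openConnection();
        conn.setRequestMethod("GET");
        conn.getResponseCode();          // wait for the response
        conn.disconnect();
        return (System.nanoTime() - start) / 1_000_000; // milliseconds
    }

    public static void main(String[] args) throws Exception {
        final String target = args.length > 0 ? args[0] : "http://www.example.com/";
        int concurrentUsers = 50;        // in a cloud run this could be thousands, briefly
        int requestsPerUser = 10;

        ExecutorService pool = Executors.newFixedThreadPool(concurrentUsers);
        List<Future<Long>> results = new ArrayList<Future<Long>>();

        for (int u = 0; u < concurrentUsers; u++) {
            for (int r = 0; r < requestsPerUser; r++) {
                results.add(pool.submit(new Callable<Long>() {
                    public Long call() throws Exception {
                        return timeOneRequest(target);
                    }
                }));
            }
        }

        long total = 0;
        for (Future<Long> f : results) {
            total += f.get();
        }
        pool.shutdown();
        System.out.printf("%d requests, average latency %d ms%n",
                results.size(), total / results.size());
    }
}
```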

Similarly, anything having to do with the social or collaboration aspects of software development lends itself well to the cloud. Project management, scheduling, task lists, requirements, and defect management all suit the cloud well, as these are at core group functions where communication is essential to keeping projects in sync and all members of the team – wherever they are located – literally on the same page. Of course, there is a huge caveat here: if your company designs embedded software that goes into products, it is not a good candidate for the cloud. Imagine a competitor getting hold of Apple’s project plans for the next version of the iPhone.

This guest post comes courtesy of Tony Baer's OnStrategies blog. Tony is a senior analyst at Ovum. His profile is here. You can reach him here.

Wednesday, April 22, 2009

Progress gives CEP a performance boost with multi-core support on Apama

Progress Software this week announced the release of an enhanced Parallel Correlator for its Apama Complex Event Processing (CEP) platform so it can take advantage of multi-core, multi-processor hardware.

Progress claims a seven-fold increase in CEP performance on an eight-core server in the company’s internal benchmark testing of this version of the Parallel Correlator in what the company described as “real-world customer scenarios.” [Disclosure: Progress is a sponsor of BriefingsDirect podcasts.]
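To picture what a parallel correlator buys on multi-core hardware, consider this generic, hypothetical sketch (it is not Apama's engine, its event processing language, or its API): events are partitioned by key, here a ticker symbol, so each core correlates its own partition independently while per-symbol ordering is preserved.

```java
// Generic, hypothetical sketch of key-partitioned parallel event correlation;
// it is not Apama's engine, EPL, or API. Events for the same symbol always go
// to the same worker, so per-symbol order is preserved while different symbols
// are processed on different cores in parallel.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Random;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ParallelCorrelatorSketch {

    static final int LANES = Runtime.getRuntime().availableProcessors();
    static final double ALERT_MOVE = 0.02; // flag a 2 percent move between consecutive ticks

    // One single-threaded worker per core; each worker owns its own state map,
    // so no locking is needed on the hot path.
    static final List<ExecutorService> WORKERS = new ArrayList<ExecutorService>();
    static final List<Map<String, Double>> LAST_PRICE = new ArrayList<Map<String, Double>>();

    static void correlate(final String symbol, final double price) {
        final int lane = Math.abs(symbol.hashCode() % LANES); // route by key
        WORKERS.get(lane).execute(new Runnable() {
            public void run() {
                Double previous = LAST_PRICE.get(lane).put(symbol, price);
                if (previous != null && Math.abs(price - previous) / previous >= ALERT_MOVE) {
                    System.out.printf("ALERT %s moved %.1f%%: %.2f -> %.2f%n",
                            symbol, 100 * (price - previous) / previous, previous, price);
                }
            }
        });
    }

    public static void main(String[] args) throws InterruptedException {
        for (int i = 0; i < LANES; i++) {
            WORKERS.add(Executors.newSingleThreadExecutor());
            LAST_PRICE.add(new HashMap<String, Double>());
        }

        // A tiny synthetic feed; a production correlator would chew through vastly larger streams.
        String[] symbols = { "IBM", "ORCL", "MSFT", "HPQ" };
        double[] prices = { 100, 20, 25, 35 };
        Random random = new Random(42);
        for (int tick = 0; tick < 1000; tick++) {
            int s = tick % symbols.length;
            prices[s] *= 1.0 + (random.nextDouble() - 0.5) * 0.06; // random walk, roughly +/- 3%
            correlate(symbols[s], prices[s]);
        }

        for (ExecutorService worker : WORKERS) {
            worker.shutdown();
            worker.awaitTermination(10, TimeUnit.SECONDS);
        }
    }
}
```

Pinning each key to a single worker is also what keeps the results repeatable as the work spreads across cores.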

John Bates, who founded Apama in 1999 to build out technology based on his research at Cambridge University in the UK, believes CEP is an easier technology sell to business users than service-oriented architecture (SOA), because a clear case can be made for the ability of a product like Apama to execute high-speed transactions based on the identification of millisecond movements in the business environment. It can also provide business managers and executives with split-second snapshots of how they are doing in their markets.

I guess we can think of CEP as SOA for high-octane business intelligence (BI): transactional and real-time insights and inferences drawn from tremendously complex and often massive streams of services (and more). Traditional BI mostly comes from read-only agglomerations of fairly static SQL data, some of which needs a lot of handholding before it gives up its analytics gems.

Incidentally, I also spoke this week with Cory Isaacson at CodeFutures, busy at the MySQL show, who has a lot to say these days about database sharding and how applying it to OSS databases like MySQL gets, among other things, more BI love from transactional read/write SQL data. More here.

Back to CEP ... This is being newly perceived by some as much more tangible to business users than the more nerdy benefits of SOA, such as reuse of services for more agile programming of new applications. Talk about the benefits of CEP and business users' eyes light up. Talk about the benefits of SOA and even business process management (BPM) and their eyes can glaze over.

I should point out that my buddy at Active Endpoints, Alex Neihaus (another disclosure on their sponsorship of BriefingsDirect podcasts), would argue that CEP and SOA are the real somnolence inducers, and that BPM and visual orchestration form the far better point on the business value arrow around service swarms. Talk among yourselves ...

In making the latest Apama announcement, Progress touts an IDC report on CEP (excerpts) that included evaluation of the 2008 version of the Apama platform. IDC gave the Progress product high ratings in the categories of “Low Latency,” the speed of event processing, “Business User Control,” how it works for the business people, and “Deterministic Behavior,” the predictability and repeatability of the event processing programs.

Lo and behold, although it is not mentioned in the Progress announcement, Apama did not get such high scores in the two other IDC categories, “Data Management” and “Complex Event Detection.”

IDC does non-metaphysical squares, rather than Magic Quadrants, we should gather.

In the real world, the major market for CEP appears to be the beleaguered financial services industry and the government watchdog agencies that are overseeing it. This appears reflected in the Apama customers listed in this week’s announcement, including JP Morgan, Deutsche Bank, and the FSA (Financial Services Authority) of the UK.

Written in the midst of this recession, the IDC report worries: "Because Apama is so closely identified with the financial markets, the current downturn is likely to negatively impact Apama's opportunity and growth prospects in the near term. Therefore, it is incumbent that Progress figure out how to cost effectively apply the technology to new markets with better short-term growth prospects."

At the beginning of last fall’s financial system meltdown, Bates told a reporter that there may be a silver lining for CEP even in the midst of a banking crisis. He foresees potential for greater use of CEP by both government regulators and the financial institutions that need to supply more and more detailed data to show how they are complying with new regulations now being formulated, as well as old regulations now being more rigorously enforced.

Too bad they can't apply it to card counting or my wife's algorithmic-rich shuffling of copious coupons for generating a simple groceries list. Just start with the old one, I keep telling her.

Other industries that both Progress and IDC agree might provide new markets for CEP include transportation and inventory control systems based on RFID, and ERP systems for manufacturing. I continue to be intrigued, too, by mobile commerce (Google Voice, anyone?), laced with location services and other variables like weather.

CEP is going to advance the competitive capabilities of a lot of companies. What's less clear is how they will manage that along with their BI, SOA, cloud, and other must-do-somedays on the IT groceries list.

Rich Seeley provided research and editorial assistance to BriefingsDirect on this blog. He can be reached at Writer4Hire.

Monday, April 20, 2009

TIBCO CEO worries about Oracle-Sun deal's impact on IT industry

BriefingsDirect contributor Rich Seeley interviewed Vivek Ranadive, CEO of TIBCO Software, on the day the Oracle-Sun proposed deal was announced. Here's his report.

Will there be confusion and even fear in the Java community? Can Microsoft take advantage of that? Will there be disruption in the hardware server business that works to the advantage of Cisco? Vivek Ranadive, CEO of TIBCO Software, sees a lot of question marks around Oracle’s proposed acquisition of Sun Microsystems.

“I’m sure there’s nervousness in the Java community,” Ranadive said in an exclusive interview with BriefingsDirect. “Can they trust [Oracle Chairman and CEO] Larry Ellison? What’s he going to do with this control? Is he going to manipulate Java so he gets an advantage? Is he going to make it less open? Is he going to find ways to start charging customers for it? There are a lot of question marks.”

[Disclosure: TIBCO is a sponsor of BriefingsDirect podcasts.]

Sun, as a hardware company, was committed to open source and Java, in the TIBCO CEO’s view, whereas Oracle is a software company “that has been notorious at exacting money from customers.”

In Ranadive's view Microsoft is a beneficiary of any fear, uncertainty and doubt about the future of Java and open source.

“It helps Microsoft,” Ranadive said. “If you’re a customer and you’re wondering about Java, you might just say the heck with it, I’ll go with Microsoft.”

Microsoft also has a cloud computing initiative while Oracle has been reticent, he noted.

“Larry Ellison has been on record as saying he doesn’t believe in the cloud,” Ranadive said, “whereas Microsoft jumped on the virtualization bandwagon and is going to head up the parade on that.”

While TIBCO as a middleware vendor maintains “Swiss-like neutrality” between the Java and .NET worlds, Ranadive said he has been impressed with the cloud technology coming out of Redmond.

Noting that in the midst of the recession TIBCO continues to report record earnings, Ranadive said that in an IT market where Java and .NET co-exist in many shops, his company is positioned as the “trusted arbiter in the middle.”

Using the example of the suitor Sun spurned, he notes that TIBCO competes with IBM, works with IBM and runs on IBM servers.

Ranadive said he is not concerned about what Oracle will do with the Sun hardware servers. But he noted that the “plot thickens” for the other server vendors including IBM, HP, Dell, and now Cisco.

“We don’t know what is happening with the server business,” he said. “Is Oracle going to keep it? Are they going to shut it down? Is Oracle going into the hardware business?”

But he sees Cisco possibly benefiting from any disruption in the hardware market.

“We have customers that are looking at Sun, and they may look at this and say maybe we’ll go with Cisco,” he speculated.

Ranadive also lists his own company as a beneficiary from whatever disruption arises from Oracle’s latest acquisition.

“Our customers are already rebelling against putting more eggs in the Oracle basket,” he said. “As the Swiss-neutral party that integrates everything with everything, it helps us a great deal.”

Noting that he got his own start in the business with a workstation borrowed from Sun, Ranadive did say he was sorry for the loss of what Sun once represented as the home of the innovators who created Java. But he doesn’t believe Silicon Valley has lost its innovative spirit.

“The innovators will show up at other companies, including ours,” he said. “Innovation is here to stay in the Valley.”

On the lighter side, the TIBCO CEO speculated that Ellison might acquire Best Buy next.

“He seems to like commodity products that can be accretive,” Ranadive quipped. “So maybe Best Buy will be his next purchase. He could even start selling servers in his shop.”

In a more serious analogy, Ranadive noted that the accretive model has been used before in the software business, most notably by Computer Associates, which steadily acquired mainframe software companies in the 1980s and the 1990s.

“Oracle has become the CA of the present era,” he said.

Rich Seeley provides research and editorial assistance to BriefingsDirect. He can be reached at Writer4Hire.